Abstract

To summarize the data in which we have at our disposal, we have a subset of a study, conducted by Steinmetz et al. (2019), in which 10 mice had experiments performed on them over 39 different sessions, with each session containing hundreds of trials. The neural activity of these mice was investigated after being exposed to some some stimuli, at varying levels. Multiple variables, which is explained later, were utilized to try and predict whether an individual trial would be deemed a success or not. For the purposes of our project, we will only be looking at the first 18 sessions, which involve four mice: Cori, Frossman, Hence, and Lederberg.

Introduction

Using these 18 sessions, we wish to build a predictive model which predicts the feedback or outcome of every trial using our neural activity data and stimuli information. Each session has a different amount of trials, ranging from around 100 trials to 400 trials. Each trial consisted of the mouse being presented with two different levels of stimuli on their left and right side, and this trial would only last about 4 seconds every time. Each trial would produce different neural activity in the form of spikes.

The real-world motivations of our results are very promising considering what we are dealing with. We can gain insights into the way in which mice make their decisions, and how they process things in their visual cortex. How the brain of a mouse processes information and makes decisions is vital knowledge, as mice and humans have been related for such a long time. With mice being a good model and test dummy for humans, we can use the results of our data to apply this our understanding of how humans think. There are a lot of different domains which could be influenced and intrigued by the findings of this data.

There are a few hypotheses that I can make based on my understanding of this data, before diving into exploring the data-set. Firstly, I think that the brighter the contrast level is per given trial, the more successful the outcome will be. The mice will be more responsive to the higher levels of contrast as opposed to smaller and nonexistent levels. Secondly, I believe that mice will perform better at the back end of the trials. As the mice get more used to the trials in which they are apart of, they will have more consistently successful outcomes in the last few trials conducted. Finally, the last hypothesis I predict is that the combination of multiple sessions together will result in a better representation of the trials as a whole. Our model will be improved by this combination of data, and the results which are yielded will be better.

Background

For all 18 of the sessions, we have 6 variables for each individual trial. Some of the key variables to our project which will be mentioned frequently throughout this report are given an explanation below:

feedback_type: the type of feedback that the mouse receives based off of the outcomes of what they decided to do. 1 represents a success and -1 represents a failure.

spks: how neural activity is measured per trial. the number of neuron spikes which we see in the visual cortex of the mice’s brain.

contrast_left: the contrast level of the left stimulus, varying from {0, 0.25, 0.5, 1}. 0 represents no stimuli.

contrast_right: the contrast level of the right stimulus, varying from {0, 0.25, 0.5, 1}. 0 represents no stimuli.

The mice were given positive reinforcement as feedback if they chose the correct side, left or right, based off of which side had a higher contrast level.

The variable representing feedback_type is our outcome in the model that we are going to build. We are looking to construct a model which can predict if the mouse responds successfully or unsuccessfully in every trial, based off the decisions that the mice are making due to the contrast levels. We are trying to predict the type of feedback using the spike trains of neurons and the stimuli contrasts.

We start with data analysis of this large data-set, in order to make the best model for prediction. We need to describe our data structures across the 18 different sessions using the variables like the number of neurons, the number of trials, stimuli conditions, and feedback types. In each trial, we need to explore the neural activities of the mice, and use that to compare the changes in activity across trials. Finally, homogeneity and heterogeneity needs to be taken into consideration when looking across the sessions and mice. Visualizations will be an important aspect of our analysis, as different plots and charts will be able to tell the story of our study at a macro level.

The results which we will be generating from our data analysis will have an impact on our understanding when it comes to the underlying processes of visual perception and the decision-making that mice exemplify. We can look at the results and make assumptions on how mice respond to levels of contrast and the affects that may come with feedback.

With this analyzed data, we will look to extract the patterns we see across sessions and also potentially address the differences we see between sessions, to combine the data across the trials with a specific approach depending on the criteria. Ultimately, we are combining the best data across the 18 sessions to improve the prediction model.

When we construct our prediction model, we will test it’s performance at the end by evaluating it on two test sets of 100 trials. Our model will be able to predict the outcome or feedback type of a trial.

Exploratory Analysis

Lets start to explore the data so that we have a full understanding of what we are dealing with in order to make predictions and form our model. Here is a table which summarizes the data well, and can give a great overview of what we are working with.

Table 1: this represents the summary statistics of each of the 18 sessions, across 6 important variables.

We see the four different mice that the 18 different sessions all deal with, with each mouse being labeled with its name under the variable mouse_name. Here’s a description of the rest of the variables:

date_exp: just represents which day every session was performed.

n_brain_area: provides the number of different brain areas where the neurons come from.

n_neurons: this variable shows how many neurons were observed for that given session.

n_trials: this represents the exact amount of trials that were ran in the given session. We see that the number isn’t the same for every session, and that the number of trials varies a decent amount.

success_rate: this variable is a representation of the proportion of how many trials resulted in the mouse making the correct, successful decision regarding the contrast levels.

We can see that session 10 with Hench had the most trials ran, at 447, while the first trail had the least amount of trials. Likewise, this first trial also had the lowest success rate at 60.5%. Our two highest success rates are in the low 80’s, at 80.6% and 83%, and we find these ratios on the last two sessions of the data, session 17 and 18 respectively. The mouse Lederberg seemed to also have the highest rates of success on the most consistent basis. This mouse has the most sessions out of the four mice in this 18 sessions. It’s interesting to keep this in mind, as success rate is an important variable to measure as we continue to look into the data.

Although it’s easy to look at the 18 sessions from this summarized level, let’s dive into specific trials in a session to further explore and improve upon our understanding of the data we have. I’ll choose the third session with Cori, as this session seems to have a good number of neurons, trials, and a reasonable success rate, with all three of those variables not being too extreme compared to the other sessions. The 11 brain areas involved in the third session are CA1, DG, LP, MG, MRN, NB, POST, root, SPF, VISam, VISp. These are the brain areas where 619 total neurons lie for session 3. These two variables are key when we want to explore neural activities. As a reminder, the neural activity is stored in every trial of our data in the form of spike trains (spks variable). These spike trains are a collection of time stamps which correspond to the neuron firing. Only the spike trains of neurons from the onset of the stimuli to 0.4 seconds post-onset are under what we are viewing in this project.

When looking at the first trial, we can take the average number of spikes across neurons in each brain area. This is what I’ve chosen to do in order to gauge the neural activity across different trials. We can look at specific areas of the brain and compare the average number of spikes from trial to trial. Also, comparing the average number of spikes in different brain areas gives us an idea of how certain regions of the brain may be superior to others when it comes to neural activity. The breakdown for the first trial is as follows.

Table 2: the average number of spikes for each brain area in session 3, trial 1.

Now look at the breakdown for the 2nd trial.

Table 3: the average number of spikes for each brain area in session 3, trial 2.

The final breakdown I have here is the 177th trial.

Table 4: the average number of spikes for each brain area in session 3, trial 177.

We can see the attained average of spikes across neurons for each brain area in the three different trials. MRN, SPF, and LP consistently seem to be the three regions of the brain with the highest level of spikes on average.

A summary table can be created to display the average number of spikes across neurons for all 11 areas of the brain for every single trial in session 3. In addition to the number of spikes, we can also see the feedback (whether success or not), and the contrast levels for left and right for every single trial. This table can give great insight into the effects that contrast level, and neural activity in certain brain areas may affect the chances of a successful trial. We can look for correlation between a success and what factors are consistently occurring to potentially lead to those successes.

Table 5: summary of average number of spikes for each brain area in every trial of session 3.

Let’s visualize this data to get an even better understanding.

Visualization 1: for each trial in session 3, the average spike count is visualized by brain area.

Our plot shows the average spike count of each area of the brain across all 228 trials in session 3. MRN is the area of the brain which has the greatest amount of spikes on average by a good amount. The green points at the top represent it’s distribution, as all eleven brain regions are color coded in the top right legend. What we can learn from this plot is how each region of the brain is performing neurally when it comes to the counts of spikes on average. We can determine which brain areas have greater neural activity than others. Most of the areas are consistent in their distribution as well, indicating that there isn’t much variation in how the number of spikes varies on average. The SPF area does stand out in this regard, as the count of spikes does fluctuate the most, but for the most part, this brain area has the second best neural activity. I think that we can improve this plot by incorporating the data regarding feedback. Whether a trial is successful or not is an important aspect of our research, and this is missing from our plot. The contrast levels aren’t visible in this plot, and it would be interesting to see what the levels of contrast were when we saw high spike counts, and low spike counts.

Now let’s take an even deeper dive into the individual trials itself. After looking at all the trials of session 3, we can look into the first and second trial to look for how the spike occurrences appear over time for each neuron in the first trial. This plot will give us additional insight, as we can also investigate the different types of feedback per trial. Remember, feedback 1 represents a success, while feedback -1 is not a success. Since this is such a key to the questions we have, and the models we will construct, including a plot which indicates the feedback type is very helpful with providing information that can assist us in creating a predictive model. Neural activities of both types of feedback in every trial is nice data to have visualized.

Visualization 2: two raster plots which show every occurrence of a spike for a given trial.

Each dot in this plot represents when a spike happened for every neuron. The data is again color coded by the region of the brain in which the spike occurred. We can see the distribution of the neural activity for the first and second trial based off of this plot. We also notice that the first two trials were both successful, and how some brain regions are more prevalent than others.

In trial 1, the neural activities can be described as widespread. Many different areas of the brain are all involved, indicating the role they play in the activity of spikes. The pattern of certain areas of the brain in particular are consistent throughout, such as MRN and DG. Looking at the trial 2, we see that the activity is pretty similar. Since the trials were so close to each other in time, and the trials were both also successful, it’s not a surprise that they have similar distributions when it comes to occurrence of spikes. For both trials, the section on the plot between the two ‘green’ brain areas seem to be the emptiest. In between MG and MRN, the occurrence of spikes isn’t as prevalent. This is interesting to notice, as a few regions of the brain don’t seem to be as involved, even though we know that this trial was a successful one. The variation in colors and data point makes sense considering the result of this trial, and its fair to predict that a plot of a trial which was a failure may not be as colorful and filled.

Let’s compare these plots to an unsuccessful trial to test this theory.

Visualization 3&4: two more raster plots which show every occurrence of a spike for a given trial.

We can see from both of these trials that there is more white area in the plot, indicating that the spike occurrence for the neurons wasn’t as frequent. MRN and MGN are still strong brain areas when it comes to neural activity, but as a whole, this trial doesn’t see the neural activity that the successful trials above were seeing. It’s also important to note the regions of the brain which are very different from successful trial to unsuccessful trial, as neural activity in these regions may be key to determining the success of a trial. Trial’s 1 and 2 both saw good frequencies for CA1, DG, LP and VISam, however these 4 brain area’s tailed off in trials 227 and 228.

We can now focus on exploring homogeneity and heterogeneity across sessions and mice. First lets define and explain the process which we will be doing, and why its important. Basically, we will be looking for similarities and differences between the sessions we have at our disposal, and also between the four different mice that we have. Since we already have a plot representing the average number of spikes per brain area for session 3, lets do the same for some different sessions. We will choose sessions 2 and 4, since the second session is the some mouse as session 3, and session 4 has the same amount of affected brain regions. Lets start with session 2.

Table 6: summary of average number of spikes for each brain area in every trial of session 2.

Visualization 5: for each trial in session 2, the average spike count is visualized by brain area.

We can see that the average number of spikes per trial is much smaller for this session, compared to the 3rd session. The neural activity is very shortened in the second session compared to the third session, as there are six less brain areas being affected. The average count of spikes only maxes out at just above 2 spikes, which is much less than the third session. We see an average spike count as high as 7 in the third session. We should keep in mind that this is the same mouse, so it is interesting to see how his neural activity differs from one session to the next.

Table 7: summary of session 2 values for each variable.

Table 8: summary of session 3 values for each variable.

Looking at the summaries of both sessions also gives interesting information. The second session summary is on top, with the summary of the third session on the bottom. We see that the second session had a contrast level of 0 for every trial on the right side. Likewise, the third session had a contrast level of 1 for every trial on the right side. This is an important difference between sessions to note, and may be another reason why session three seemed to have more neural activity. The success rate was also higher for the third session, so this drastic change in contrast levels in the right hand side might be an important factor.

Table 9: summary of session 4 values for each variable.

Visualization 4: for each trial in session 4, the average spike count is visualized by brain area.

Here we see the plot for average spike counts in session 4, across the 11 brain areas, and the summary of how each region is getting the number of spikes on average. Again, this session has a decreased neural activity compared to the 3rd session. It’s more similar to the second session when you see the average number of spikes that were recorded per trial. So we can view a good amount of differences that exist between sessions. There is also similarities between sessions, with the distribution of spikes, and the contrast levels matching each other at times. However it can be hard to compare the sessions accurately due to the differences that they have in certain variables, like the difference in number of neurons, the amount of trials, and the brain areas affected. This can lead into the next section of the project, as we can start to combine data sets to get a better understanding of it as a whole.

Data Integration

From what we found and discussed earlier, I believe that integrating our data by clustering is a valid approach. We can utilize some of the similarities that we saw among the sessions and group them together into 3 clusters. A variable that we have been really interested with is the number of spike counts per trial. We’ve been looking at the average amount of spikes per trial, and we can create a vector with the average spike count for every single trial that occurred across the 18 different sessions. Here is how this vector looks when visualized.

Visualization 5: A histogram of the distribution of average spike counts across all trials.

Visualization 6: A line plot of the distribution of average spike counts across all trials.

Visualization 7: A density plot of the distribution of average spike counts across all trials.

We can unpack a lot from these different plots. Starting with the histogram, we see that the average spike count distribution is slightly skewed to the right. The highest frequencies of average spike counts were between 0.75 and 1.5. Looking at the line plot, we can see how the average spike counts differ from session to session. The groupings indicate which trials make up the first session, second session, and so on. We can see from here that some sessions have a high average count for spikes, will most sessions have their average spike count at 2 spikes and under. Finally, our density plot indicates which average spike counts are most prevalent.

Now that we see some of the patterns that occur when looking at all average spike counts of the trials, lets perform the clustering. Three clusters seems like a good number considering the distribution of average spike counts across trials, as the average number of spikes rarely exceeds three. Here is a scatter plot of the clusters:

Visualization 8: The distribution of average spike counts across all trials, split into three clusters.

The clustering method comes out really well, as the three groups are split well. We see that the first cluster, labeled in black, represents all trials where the average number of spikes was roughly under 1.19 spikes. The second cluster in red represents all trials where the average number of spikes was between 1.19 and 1.84 approximately. Finally, our third cluster in green is all the trials where the average number of spikes was above around 1.84 spikes. These three groupings can allow us to summarize this large data set and acknowledge some of the patterns that are shared across all trials. This can tie into our prediction model, as we will use these grouping along with the outcome types and average number of spikes per trial to build our model.

Predictive Modeling

I decided to use the logistic regression model to predict out outcome type for each trial, whether it is a successful one or not. The accuracy of our test is 71%, at 0.7100964 to be exact.

Prediction Performance on Test Sets

Now that we have access to the test data sets, lets test our prediction model against them. The two data sets are from sessions 1 and 18 respectively, which are two sessions we haven’t really looked at in this report. Applying our logistical regression model to these two tests sets, we see that the accuracy of predicting the correct outcome type is 72.5%. This is slightly better than the previous accuracy rate we got, and this is a good sign.

Lets also look at the confusion matrix.

Our confusion matrix displays the breakdown of successful and unsuccessful feedback of each cluster. We see that success is more likely per trial, and that we had over 60% success rates for both session 1 and 18.

Discussion

To conclude the actions and subsequent findings of this report, we started by analyzing 18 different sessions of hundreds of trials to find some patterns in neural activity of mice. A predicted model was created in order to predict whether the outcome of a trial would be successful or not. A logistic regression model was utilized, with the usage of clustering and the average count of spikes per trial, in order to come up with the best prediction for feedback type. We ultimately got a model with 71% accuracy in these predictions.

Acknowledgement

I’d like to acknowledge the TA’s and the discussions that they had which assisted me in this project. The consulting sessions were good check-ins throughout the course which assisted on my understanding of the concepts developed in this project.

Code Appendix

library(dplyr)
## 
## Attaching package: 'dplyr'
## The following objects are masked from 'package:stats':
## 
##     filter, lag
## The following objects are masked from 'package:base':
## 
##     intersect, setdiff, setequal, union
library(ggplot2)
library(knitr)
library(tidyverse)
## ── Attaching core tidyverse packages ──────────────────────── tidyverse 2.0.0 ──
## ✔ forcats   1.0.0     ✔ stringr   1.5.0
## ✔ lubridate 1.9.2     ✔ tibble    3.2.1
## ✔ purrr     1.0.1     ✔ tidyr     1.3.0
## ✔ readr     2.1.4
## ── Conflicts ────────────────────────────────────────── tidyverse_conflicts() ──
## ✖ dplyr::filter() masks stats::filter()
## ✖ dplyr::lag()    masks stats::lag()
## ℹ Use the conflicted package (<http://conflicted.r-lib.org/>) to force all conflicts to become errors
library(glmnet)
## Loading required package: Matrix
## 
## Attaching package: 'Matrix'
## 
## The following objects are masked from 'package:tidyr':
## 
##     expand, pack, unpack
## 
## Loaded glmnet 4.1-7
library(caTools)
library(caret)
## Loading required package: lattice
## 
## Attaching package: 'caret'
## 
## The following object is masked from 'package:purrr':
## 
##     lift
session <- list()

for (i in 1:18) {
  session[[i]] <- readRDS(paste("~/Downloads/sessions/session", i, ".rds", sep=""))
}

n.session=length(session)

meta <- tibble(
  mouse_name = rep('name',n.session),
  date_exp =rep('dt',n.session),
  n_brain_area = rep(0,n.session),
  n_neurons = rep(0,n.session),
  n_trials = rep(0,n.session),
  success_rate = rep(0,n.session)
)

for(i in 1:n.session){
  tmp = session[[i]];
  meta[i,1]=tmp$mouse_name;
  meta[i,2]=tmp$date_exp;
  meta[i,3]=length(unique(tmp$brain_area));
  meta[i,4]=dim(tmp$spks[[1]])[1];
  meta[i,5]=length(tmp$feedback_type);
  meta[i,6]=mean(tmp$feedback_type+1)/2;
}

kable(meta, format = "html", table.attr = "class='table table-striped'",digits=3)
mouse_name date_exp n_brain_area n_neurons n_trials success_rate
Cori 2016-12-14 8 734 114 0.605
Cori 2016-12-17 5 1070 251 0.633
Cori 2016-12-18 11 619 228 0.662
Forssmann 2017-11-01 11 1769 249 0.667
Forssmann 2017-11-02 10 1077 254 0.661
Forssmann 2017-11-04 5 1169 290 0.741
Forssmann 2017-11-05 8 584 252 0.671
Hench 2017-06-15 15 1157 250 0.644
Hench 2017-06-16 12 788 372 0.685
Hench 2017-06-17 13 1172 447 0.620
Hench 2017-06-18 6 857 342 0.795
Lederberg 2017-12-05 12 698 340 0.738
Lederberg 2017-12-06 15 983 300 0.797
Lederberg 2017-12-07 10 756 268 0.694
Lederberg 2017-12-08 8 743 404 0.765
Lederberg 2017-12-09 6 474 280 0.718
Lederberg 2017-12-10 6 565 224 0.830
Lederberg 2017-12-11 10 1090 216 0.806
# how i got avg spike count for each brain area in session 3. i.s represents session number and i.t represents trial number. I changed i.t to equal 2 and 177 to get the list for those trials.
i.s=3

i.t=1  

spk.trial = session[[i.s]]$spks[[i.t]]
area=session[[i.s]]$brain_area

spk.count=apply(spk.trial,1,sum)

spk.average.tapply=tapply(spk.count, area, mean)

# function which gets average spike count for each brain area in specific session and trial. Use i.s to change between sessions, and number the trial. 
average_spike_area<-function(i.t,this_session){
  spk.trial = this_session$spks[[i.t]]
  area= this_session$brain_area
  spk.count=apply(spk.trial,1,sum)
  spk.average.tapply=tapply(spk.count, area, mean)
  return(spk.average.tapply)
}
# here, 1 represents the first trial. This number can be changed to view other trials. 
average_spike_area(1,this_session = session[[i.s]])
##      CA1       DG       LP       MG      MRN       NB     POST     root 
## 2.738095 4.058824 4.500000 2.722628 5.634146 1.674419 1.111111 1.166667 
##      SPF    VISam     VISp 
## 4.600000 2.061404 1.324561
n.trial=length(session[[i.s]]$feedback_type)
n.area=length(unique(session[[i.s]]$brain_area ))

# Adding feedback type,  the two contrasts, and the trial id to our data frame that contains the average spike counts for each area

trial.summary =matrix(nrow=n.trial,ncol= n.area+1+2+1)
for(i.t in 1:n.trial){
  trial.summary[i.t,]=c(average_spike_area(i.t,this_session = session[[i.s]]),
                          session[[i.s]]$feedback_type[i.t],
                        session[[i.s]]$contrast_left[i.t],
                        session[[i.s]]$contrast_right[i.s],
                        i.t)
}

colnames(trial.summary)=c(names(average_spike_area(i.t,this_session = session[[i.s]])), 'feedback', 'left contr.','right contr.','id' )

trial.summary <- as_tibble(trial.summary)

#Creating plot for avg spike counts in every trial in session 3, color coded by brain area. 
area.col=rainbow(n=n.area,alpha=0.7)
plot(x=1,y=0, col='white',xlim=c(0,n.trial),ylim=c(0.5,7.7), xlab="Trials",ylab="Average spike counts", main=paste("Spikes per area in Session", i.s))


for(i in 1:n.area){
  lines(y=trial.summary[[i]],x=trial.summary$id,col=area.col[i],lty=2,lwd=1)
  lines(smooth.spline(trial.summary$id, trial.summary[[i]]),col=area.col[i],lwd=3)
  }
legend("topright", 
  legend = colnames(trial.summary)[1:n.area], 
  col = area.col, 
  lty = 1, 
  cex = 0.8
)

#This plot displays the spike occurrences over time for each neuron in a specific trial, color coded by the brain area.
plot.trial<-function(i.t,area, area.col,this_session){
    
    spks=this_session$spks[[i.t]];
    n.neuron=dim(spks)[1]
    time.points=this_session$time[[i.t]]
    
    plot(0,0,xlim=c(min(time.points),max(time.points)),ylim=c(0,n.neuron+1),col='white', xlab='Time (s)',yaxt='n', ylab='Neuron', main=paste('Trial ',i.t, 'feedback', this_session$feedback_type[i.t] ),cex.lab=1.5)
    for(i in 1:n.neuron){
        i.a=which(area== this_session$brain_area[i]);
        col.this=area.col[i.a]
        
        ids.spike=which(spks[i,]>0) #seeing where spikes are
        if( length(ids.spike)>0 ){
            points(x=time.points[ids.spike],y=rep(i, length(ids.spike) ),pch='.',cex=2, col=col.this)
        }
      
            
    }
    
legend("topright", 
  legend = area, 
  col = area.col, 
  pch = 16, 
  cex = 0.8
  )
}
varname=names(trial.summary);
area=varname[1:(length(varname)-4)]
plot.trial(1,area, area.col,session[[i.s]])

varname=names(trial.summary);
area=varname[1:(length(varname)-4)]
par(mfrow=c(1,2))
plot.trial(1,area, area.col,session[[i.s]])
plot.trial(2,area, area.col,session[[i.s]])

plot.trial(227,area, area.col,session[[i.s]])
plot.trial(228,area, area.col,session[[i.s]])

#creating vector for avg spike cont of all 5081 trials, across the 18 sessions
avg_spikes <- numeric(0)

for (i in 1:length(session)) {
  this_session <- session[[i]]
  
  # Loop over trials in the current session
  for (j in 1:length(this_session$spks)) {
    spks_trial <- this_session$spks[[j]]
    total_spikes <- apply(spks_trial, 1, sum)
    avg_spikes <- c(avg_spikes, mean(total_spikes))
  }
}

avg_spikes
##    [1] 1.5817439 1.3119891 1.8446866 1.3814714 1.4250681 1.0940054 2.1021798
##    [8] 1.7043597 1.5095368 1.7316076 1.5040872 1.9046322 1.7643052 1.4591281
##   [15] 1.7234332 1.3882834 1.3637602 2.0544959 1.9209809 2.1049046 1.3051771
##   [22] 1.2561308 1.3474114 1.5967302 1.6771117 1.9604905 1.9427793 2.3392371
##   [29] 1.0967302 1.8991826 1.7057221 1.3365123 1.2629428 1.7356948 1.8678474
##   [36] 1.8542234 1.5476839 1.5122616 1.1267030 1.8256131 1.7588556 1.3310627
##   [43] 1.8038147 1.8596730 1.6689373 2.0504087 1.7111717 1.1866485 1.3188011
##   [50] 1.3079019 1.8841962 1.8651226 1.4250681 1.6376022 1.4822888 2.0558583
##   [57] 1.3896458 1.2370572 1.2574932 2.0163488 1.9918256 1.5081744 1.3228883
##   [64] 1.5408719 1.4441417 1.4182561 1.8215259 1.3228883 1.4455041 1.4713896
##   [71] 1.4400545 1.1825613 1.5286104 1.5122616 1.4536785 1.4659401 1.2356948
##   [78] 1.3583106 1.7070845 1.1158038 1.5722071 1.3324251 1.8174387 1.9441417
##   [85] 1.2874659 1.8228883 1.2370572 1.7643052 1.3228883 1.2629428 1.3297003
##   [92] 1.4768392 1.2629428 1.2820163 1.5081744 1.2847411 1.2275204 1.4114441
##   [99] 1.9795640 1.2997275 1.2929155 1.4604905 1.2070845 1.6852861 1.2016349
##  [106] 1.2956403 1.2207084 1.2779292 1.7861035 1.4441417 1.3760218 1.1485014
##  [113] 1.7970027 1.2929155 1.6140187 1.0009346 1.2551402 1.2878505 1.2392523
##  [120] 0.9616822 1.4252336 1.6663551 1.1224299 0.8626168 1.3719626 1.4654206
##  [127] 1.5074766 1.3205607 1.3448598 1.4336449 1.4803738 1.2682243 1.2700935
##  [134] 1.2794393 1.3355140 1.3747664 1.4149533 1.3224299 1.0495327 1.5271028
##  [141] 1.1794393 1.2448598 1.4205607 1.3140187 1.3803738 1.3990654 1.3766355
##  [148] 1.2037383 1.0747664 1.3747664 1.2130841 1.2598131 1.1766355 1.2037383
##  [155] 1.3588785 1.4149533 1.2214953 1.5654206 1.4121495 1.5981308 1.4700935
##  [162] 1.4018692 1.0943925 1.2607477 1.4373832 1.1971963 1.2514019 1.3635514
##  [169] 1.4074766 1.3532710 1.4121495 1.3345794 1.3336449 1.0925234 1.3906542
##  [176] 1.2037383 1.1411215 1.2598131 1.4616822 1.4121495 1.5056075 1.2841121
##  [183] 1.2327103 1.0186916 1.3588785 0.9373832 1.5887850 0.9485981 1.4925234
##  [190] 0.9401869 1.3186916 1.0607477 1.3373832 1.4635514 1.5429907 1.4626168
##  [197] 1.2775701 1.4934579 1.4644860 1.3588785 1.2635514 1.3710280 1.1579439
##  [204] 1.0915888 0.9803738 1.1700935 0.9710280 1.1233645 1.3102804 1.1000000
##  [211] 1.1112150 0.9841121 1.0710280 1.3056075 1.4485981 1.0635514 1.2355140
##  [218] 1.1794393 1.4392523 1.3757009 1.3925234 1.1644860 1.1158879 1.5373832
##  [225] 1.3495327 1.2616822 1.0990654 1.3915888 1.6299065 1.2738318 1.2878505
##  [232] 1.2009346 1.1401869 1.3224299 1.1981308 1.0158879 1.4093458 1.0728972
##  [239] 1.0850467 1.0766355 1.2635514 1.1757009 1.6046729 1.3813084 1.2299065
##  [246] 1.1719626 1.0130841 1.2869159 0.9252336 0.9439252 1.0766355 1.3672897
##  [253] 1.1822430 1.0495327 1.2495327 1.1429907 1.3654206 1.1943925 1.0392523
##  [260] 1.2869159 1.3355140 1.0009346 1.0878505 1.1495327 1.1654206 1.2299065
##  [267] 1.1831776 1.2242991 1.3000000 1.3570093 1.2140187 1.0728972 1.3981308
##  [274] 1.3149533 1.3710280 1.1186916 1.3700935 1.2196262 1.3607477 1.3626168
##  [281] 1.6224299 1.2990654 1.1457944 1.0654206 1.4429907 1.1841121 1.3794393
##  [288] 1.5317757 1.1635514 1.4943925 1.3074766 1.2205607 1.0682243 1.1906542
##  [295] 1.7149533 1.3915888 1.0252336 1.3700935 1.1383178 1.3196262 0.9934579
##  [302] 1.2803738 1.3028037 1.0130841 1.2850467 1.0401869 1.0700935 1.1074766
##  [309] 1.5084112 0.9644860 1.1654206 1.1990654 1.1009346 1.3261682 1.2439252
##  [316] 1.3130841 1.4485981 1.2056075 1.3373832 1.5906542 1.1588785 1.2317757
##  [323] 1.2934579 0.9700935 1.6663551 1.6401869 0.9616822 1.6289720 1.2299065
##  [330] 1.4551402 0.9607477 1.4018692 1.5841121 1.0271028 1.1551402 1.4242991
##  [337] 1.2700935 1.1775701 1.3336449 1.1663551 1.0644860 1.1775701 1.4476636
##  [344] 1.2018692 1.0542056 1.0654206 1.2710280 1.4644860 1.2084112 1.2429907
##  [351] 1.2028037 1.2504673 1.4841121 1.2056075 1.4682243 1.3401869 1.0878505
##  [358] 1.0504673 1.0869159 1.4401869 0.9551402 1.2626168 1.1299065 1.3579439
##  [365] 1.3168224 2.4006462 2.5185784 2.7609047 2.5024233 2.4151858 2.0759289
##  [372] 2.4604200 2.4022617 2.4523425 2.3441034 2.2084006 2.6882068 2.4620355
##  [379] 2.2310178 2.4087237 2.8287561 2.6316640 2.3941842 2.0581583 2.5767367
##  [386] 2.6042003 2.6445880 2.0339257 1.9806139 2.2730210 2.0258481 2.4022617
##  [393] 2.0662359 2.0807754 2.6494346 1.7576737 1.9402262 2.0274637 2.2067851
##  [400] 2.1938611 2.1405493 2.5961228 1.8529887 2.4701131 2.3925687 2.2536349
##  [407] 1.8998384 1.8852989 2.2859451 2.3957997 2.3974152 2.3651050 2.1292407
##  [414] 2.0129241 2.2455574 1.9628433 2.1050081 2.0791599 2.6946688 2.4604200
##  [421] 2.6009693 2.1728595 2.5767367 1.9143780 2.2633279 2.0113086 2.4733441
##  [428] 2.5605816 2.1583199 2.1744750 2.2633279 2.0565428 2.2003231 2.5363489
##  [435] 2.1567044 2.1276252 2.0161551 2.4781906 2.2390953 2.6058158 2.3424879
##  [442] 2.3618740 2.1001616 2.0080775 2.3263328 2.0759289 2.4620355 2.3667205
##  [449] 2.2132472 2.0662359 2.2520194 2.1518578 2.3263328 2.0646204 1.9725363
##  [456] 2.1777060 2.5525040 1.8029079 2.6300485 1.9063005 2.4846527 2.6946688
##  [463] 2.3634895 1.9579968 2.4491115 1.9757674 2.1308562 2.5169628 2.4216478
##  [470] 2.4991922 2.1518578 2.4119548 1.9273021 2.0387722 1.9386107 2.0048465
##  [477] 2.3473344 1.9515347 1.7495961 2.1066236 1.8012924 2.0484653 2.4765751
##  [484] 2.1098546 2.8465267 2.3618740 2.1389338 2.5008078 2.5347334 1.8642973
##  [491] 1.7463651 2.5395800 1.9644588 2.3537964 2.4894992 2.5718901 2.4264943
##  [498] 2.1453958 2.3812601 2.2132472 2.2245557 1.9176090 2.2633279 2.1777060
##  [505] 2.3231018 2.4846527 2.4620355 1.8174475 2.6672052 1.9789984 2.3731826
##  [512] 1.9951535 2.8045234 1.8481422 2.2067851 2.6623586 2.4184168 2.0710824
##  [519] 2.5735057 1.8788368 2.0420032 2.1970921 2.3990307 2.4781906 2.2746365
##  [526] 1.9660743 2.2067851 2.2778675 2.0452342 1.9111470 2.4313409 2.1082391
##  [533] 2.5815832 2.0371567 2.4281099 2.4168013 2.4991922 2.1599354 1.7915994
##  [540] 2.2213247 1.8966074 1.9369952 2.0516963 2.0533118 2.6268174 1.8400646
##  [547] 2.2972536 2.0856220 2.1534733 2.0452342 2.0872375 2.0323102 2.5024233
##  [554] 2.0436187 2.2891761 2.1470113 2.3344103 2.3473344 2.0791599 2.2907916
##  [561] 2.1033926 2.0290792 2.1453958 2.1066236 2.1001616 2.4216478 2.2261712
##  [568] 2.2358643 2.5088853 2.1987076 2.1647819 2.5670436 2.0226171 1.9579968
##  [575] 1.8772213 2.1470113 2.2084006 1.9256866 2.2439418 1.8190630 2.1954766
##  [582] 1.9983845 2.2843296 1.9127625 2.0129241 1.8271405 2.3166397 2.3101777
##  [589] 2.2552504 2.5702746 1.9903069 2.2100162 1.9644588 1.1763708 0.9039005
##  [596] 0.9570379 1.3289994 1.1362352 1.1119276 0.9502544 0.7032222 0.9802148
##  [603] 0.9197287 1.0740531 0.9174675 0.8275862 0.8694178 0.8665913 1.0265687
##  [610] 1.0158282 0.5958168 0.6704353 0.8530243 0.8682872 0.8343697 0.5856416
##  [617] 1.0633126 0.8795930 1.0079141 0.7218768 0.8728095 0.8982476 1.0226116
##  [624] 0.8756360 1.0022612 1.0610514 0.8654607 0.8021481 1.0169587 0.7772753
##  [631] 1.0175240 0.9949124 0.9236857 0.7218768 0.7020916 0.8829847 0.9847371
##  [638] 0.7320520 1.0322216 0.6964387 0.8942906 0.7501413 0.8338044 0.9033352
##  [645] 0.9841718 1.0305257 1.1124929 0.9417750 0.8179763 0.8931600 0.9366874
##  [652] 0.8456755 0.7416620 0.6721311 0.8897682 1.0276993 0.8869418 1.0084794
##  [659] 0.9496891 1.0610514 0.7665348 0.7083098 0.9434709 0.9988694 0.5969474
##  [666] 0.6823064 1.0452233 0.7020916 0.9960430 0.6240814 0.9807801 0.9604296
##  [673] 1.1249293 1.0130017 0.8490673 0.7976258 0.7241379 0.8841153 0.8541549
##  [680] 0.9400791 1.0101752 1.0203505 0.9366874 0.9632561 0.7201809 0.9875636
##  [687] 0.9909553 1.0904466 0.9022046 0.9440362 0.8586772 0.9983041 0.6433013
##  [694] 0.7337479 1.0587903 0.7925382 0.8174110 1.0514415 0.7122668 0.7071792
##  [701] 0.6500848 0.9587337 1.1277558 0.6868287 0.8032787 1.2108536 0.6291690
##  [708] 0.7574901 1.0723573 0.9920859 0.8298474 0.7501413 0.8405879 0.8304127
##  [715] 0.6715659 0.7303561 0.7817976 0.9135105 0.9061617 0.8948559 0.9915206
##  [722] 0.8711136 1.0039570 0.6834370 0.9219898 0.9576032 0.7201809 0.9078575
##  [729] 1.0203505 0.7054833 1.0576597 0.6828717 0.7286603 0.8925947 0.6919163
##  [736] 0.5924251 0.7897117 0.7625777 0.5845110 0.6975692 1.0101752 0.6992651
##  [743] 0.9485585 0.7382702 0.7117015 0.7015263 0.6551724 0.7184850 0.8976823
##  [750] 0.7128321 0.6811758 0.7608819 0.6630865 0.6150367 0.7518372 1.0011306
##  [757] 0.9248163 0.6913510 0.8473714 0.7167891 1.0113058 0.6302996 0.9779536
##  [764] 0.7795365 0.6907858 0.7614471 0.6907858 0.6715659 0.8603731 0.7258338
##  [771] 0.6636518 0.6489542 0.6512154 0.5585076 0.6359525 0.7557942 0.7088751
##  [778] 0.6625212 0.7721877 1.2176371 0.8174110 0.7767100 0.8366309 0.9977388
##  [785] 0.8852459 0.7631430 0.6941775 0.8773318 0.7427925 0.7020916 0.7416620
##  [792] 0.6342566 0.8208027 0.6455625 0.7625777 0.7083098 0.5918598 0.6630865
##  [799] 0.8089316 0.6981345 0.9157716 0.6512154 0.8388920 0.7156586 0.8581119
##  [806] 0.7823629 0.6981345 0.5918598 0.6472583 0.7947993 0.7908423 0.6031656
##  [813] 0.6540418 0.6314302 0.8151498 0.7648389 0.6868287 0.7026569 0.9005088
##  [820] 1.2566422 0.8728095 0.7303561 0.7806670 0.8869418 0.8778971 0.8422838
##  [827] 0.7817976 1.0327869 1.1271905 0.8140192 0.8162804 0.8479367 1.2436405
##  [834] 0.8383267 0.9841718 0.8733748 0.8773318 0.7179197 0.7704918 0.6630865
##  [841] 1.0339175 0.7993217 1.3091922 0.8616527 1.2674095 0.8467967 1.2553389
##  [848] 1.3194058 1.4623955 1.4568245 1.4150418 1.3463324 1.1708449 1.2896936
##  [855] 1.0752089 1.4698236 1.2246982 1.3361188 1.2033426 1.5032498 1.0705664
##  [862] 1.2005571 1.3416899 1.4345404 1.2154132 0.9870009 1.2831941 1.4011142
##  [869] 1.2042711 1.3695450 1.0947075 0.9220056 0.9777159 1.2014856 1.3463324
##  [876] 1.4131848 1.3156917 1.1522748 1.3286908 1.2116992 1.2934076 0.8245125
##  [883] 1.3351903 1.1727019 0.8895079 1.2339833 1.2859796 0.9285051 1.2395543
##  [890] 1.0659239 1.1002786 1.5338904 1.0343547 0.9907149 0.8820799 1.1439183
##  [897] 1.0037140 1.1002786 1.0157846 1.3351903 1.2636955 1.1680594 1.0817084
##  [904] 0.8792943 1.4354689 0.9990715 1.3045497 1.3426184 1.3444754 1.4122563
##  [911] 0.8746518 0.9238626 0.8254410 1.1049211 1.2107707 0.8328691 1.1894150
##  [918] 0.7613742 0.9424327 1.4271123 0.9545032 1.3398329 1.2534819 1.4206128
##  [925] 0.9972145 1.3593315 1.1299907 1.0000000 1.1894150 1.3101207 1.3844011
##  [932] 1.3008357 0.9628598 1.3899721 0.9888579 0.9164345 1.4633240 0.8969359
##  [939] 1.2544104 1.3695450 1.3342618 1.1420613 1.4391829 0.8477252 1.1030641
##  [946] 1.1197772 1.2516249 1.2237697 1.1717734 1.0064995 1.0659239 1.4215413
##  [953] 1.4345404 1.3091922 1.2005571 1.2209842 0.9145775 1.2794800 1.2265552
##  [960] 1.3351903 1.3825441 1.3407614 0.8579387 1.0427112 1.5041783 1.2079851
##  [967] 1.0064995 0.7725162 0.9507892 1.2952646 1.3955432 1.3788301 0.9545032
##  [974] 1.2618384 1.0854225 0.8811513 0.9962860 1.0055710 1.0306407 0.9888579
##  [981] 1.2079851 1.2479109 1.4419684 1.4382544 1.1727019 0.9981430 1.0798514
##  [988] 1.1819870 1.1643454 1.2581244 1.1671309 0.9266481 1.3454039 1.3277623
##  [995] 1.3398329 1.2989786 0.7901578 1.0612813 1.1578459 1.4512535 0.8987929
## [1002] 0.9396472 0.8523677 1.1652739 1.2200557 1.0297122 1.3008357 1.2822656
## [1009] 1.0789229 1.2599814 0.8802228 1.0092851 1.2869081 0.8755803 0.9730734
## [1016] 1.3741876 0.9628598 1.1244197 0.9182916 0.9860724 1.1216342 0.9693593
## [1023] 1.2256267 0.9452182 0.9535747 1.1550604 0.7938719 1.0129991 0.9888579
## [1030] 1.0148561 1.1021356 1.0714949 1.1253482 1.2051996 0.7855153 1.1940576
## [1037] 1.0612813 1.3416899 1.3426184 1.1708449 0.9090065 0.9359331 0.9693593
## [1044] 1.0445682 0.8570102 0.8347261 1.2256267 0.8560817 1.1207057 1.3305478
## [1051] 0.8625812 1.1179201 0.7864438 0.7511606 0.7864438 0.8755803 0.9972145
## [1058] 0.8737233 0.8207985 1.0473538 1.0139276 0.8245125 1.0287837 0.9322191
## [1065] 1.1782730 0.8607242 1.0668524 1.0297122 0.9275766 1.0714949 0.8495822
## [1072] 0.9916435 1.1457753 1.0835655 0.9480037 1.3649025 0.9610028 0.9433612
## [1079] 0.8115135 0.9275766 1.0631383 0.8950789 1.2209842 0.7873723 1.0427112
## [1086] 0.9526462 0.8189415 1.0770659 1.0139276 0.9108635 0.9526462 0.9610028
## [1093] 0.7948004 0.8180130 0.9015785 0.8997214 0.5936698 0.6142002 0.7408041
## [1100] 0.7288281 0.5662960 0.6501283 0.8280582 0.7664671 0.6980325 0.7459367
## [1107] 0.7519247 0.9204448 0.7690334 0.7613345 0.8126604 0.7527802 0.7331052
## [1114] 0.6518392 0.7245509 0.5543199 0.8648417 0.5611634 0.8400342 0.6073567
## [1121] 0.5577417 0.6201882 0.7485030 0.5508982 0.5115483 0.5064157 0.7005988
## [1128] 0.5355004 0.7493584 0.5996578 0.6903336 0.6022241 0.5508982 0.6843456
## [1135] 0.7715997 0.5175364 0.7476476 0.7844311 0.6082121 0.6937553 0.6603935
## [1142] 0.6937553 0.8075278 0.5004277 0.7835757 0.6826347 0.7151411 0.6244654
## [1149] 0.6347305 0.7031651 0.5859709 0.7562019 0.4935843 0.7861420 0.6646707
## [1156] 0.5885372 0.5124038 0.6911891 0.7562019 0.7810094 0.5252352 0.5928144
## [1163] 0.6954662 0.6834902 0.6347305 0.4901625 0.5098375 0.8195038 0.7048760
## [1170] 0.8092387 0.7365269 0.7254063 0.5124038 0.6852010 0.8015398 0.7074423
## [1177] 0.4764756 0.8374679 0.6313088 0.9213003 0.7168520 0.6407186 0.5166809
## [1184] 0.6732250 0.6484175 0.7288281 0.6663815 0.6809239 0.6834902 0.7399487
## [1191] 0.6561163 0.4619333 0.6544055 0.7536356 0.7082977 0.5765612 0.7639008
## [1198] 0.6586826 0.7758768 0.8622754 0.7048760 0.6988879 0.5919589 0.6852010
## [1205] 0.5508982 0.5654405 0.9443969 0.5414885 0.6424294 0.6535500 0.6698033
## [1212] 0.6047904 0.6980325 0.5996578 0.5671514 0.7698888 0.6552609 0.5166809
## [1219] 0.7005988 0.7750214 0.6723695 0.7390932 0.5072712 0.5842601 0.6347305
## [1226] 0.5320787 0.7279726 0.7023097 1.0402053 0.7159966 0.7852866 0.7399487
## [1233] 0.6372968 0.7801540 0.7810094 0.5757057 0.7100086 0.5329341 0.7946963
## [1240] 0.8118050 0.7117194 0.6039350 0.5834046 0.5722840 0.5209581 0.5645851
## [1247] 0.5260907 0.7536356 0.7519247 0.6509837 0.6852010 0.6817793 0.7254063
## [1254] 0.5226689 0.6757913 0.6740804 0.8733961 0.6766467 0.5372113 0.5859709
## [1261] 0.5996578 0.5278015 0.5662960 0.5697177 0.9144568 0.5183918 0.8443114
## [1268] 0.4961506 0.6424294 0.6073567 0.7271172 0.7416595 0.6980325 0.7912746
## [1275] 0.4987169 0.7408041 0.7698888 0.5919589 0.9153122 0.8879384 0.5585971
## [1282] 1.0196749 0.7177074 0.7792985 0.4961506 0.7613345 0.4918734 0.8597092
## [1289] 0.8118050 0.8126604 0.6150556 0.6398631 0.7741660 0.6193328 0.7741660
## [1296] 0.7134303 0.7656116 0.5543199 0.8408896 0.9153122 0.5089820 0.6424294
## [1303] 0.7613345 0.6441403 0.6971771 0.6065013 0.6586826 0.7245509 0.8066724
## [1310] 0.6792130 0.5551754 0.5440547 0.5500428 0.4935843 0.5594525 0.6278871
## [1317] 0.5303678 0.5372113 1.0196749 0.5072712 0.7493584 0.4867408 0.6364414
## [1324] 0.6304534 0.6278871 0.4790419 0.5466210 0.8220701 0.6552609 0.7510693
## [1331] 0.6372968 0.9238666 0.6621044 0.6766467 0.5423439 0.7887083 0.5568862
## [1338] 0.5038494 0.5440547 0.5329341 0.6099230 0.4927288 0.5662960 0.9409752
## [1345] 0.8768178 0.5628743 0.6099230 0.9580838 0.7946963 0.6304534 0.6954662
## [1352] 0.9315654 0.5115483 0.5859709 0.5124038 0.5936698 0.5055603 0.5953807
## [1359] 0.5628743 0.5457656 0.4764756 0.5303678 0.5320787 0.6287425 0.5876818
## [1366] 0.5226689 0.4850299 0.4875962 0.5329341 0.4550898 0.5124038 0.4739093
## [1373] 0.5457656 0.5919589 0.6809239 0.7775877 0.7510693 0.4850299 0.7570573
## [1380] 0.6578272 0.7784431 0.5534645 0.7818648 0.6065013 0.5474765 0.4465355
## [1387] 1.1558219 0.8869863 0.7106164 1.0513699 0.7380137 1.8202055 1.1712329
## [1394] 0.8750000 1.4143836 1.2465753 0.8424658 0.9845890 0.8904110 1.4880137
## [1401] 1.3767123 1.4845890 1.2808219 0.9863014 1.4178082 1.8544521 1.1832192
## [1408] 1.5976027 1.4777397 1.0445205 1.0393836 0.9537671 1.0119863 1.5565068
## [1415] 1.3013699 1.3116438 1.6284247 1.2517123 1.1609589 1.3578767 1.8407534
## [1422] 1.3133562 0.9914384 1.0856164 1.2996575 1.1369863 1.9400685 1.4537671
## [1429] 1.6438356 1.6609589 1.6541096 1.1575342 1.3407534 1.5513699 1.3184932
## [1436] 1.0445205 0.9828767 1.0599315 2.0428082 1.9554795 0.9434932 1.5085616
## [1443] 1.3493151 1.8407534 1.0993151 1.1643836 1.4503425 1.3921233 1.7825342
## [1450] 1.4315068 1.1643836 1.6369863 1.2123288 1.3116438 2.0222603 1.6746575
## [1457] 1.1489726 0.9828767 1.3184932 1.4708904 1.9417808 1.8082192 1.5633562
## [1464] 1.6678082 1.1404110 1.6489726 1.6626712 1.2945205 1.1027397 1.9982877
## [1471] 1.6164384 1.1061644 1.3184932 1.2517123 1.4366438 1.7243151 1.1815068
## [1478] 1.3339041 1.9691781 1.8681507 1.1592466 1.8715753 1.9931507 1.0702055
## [1485] 1.2106164 1.8698630 1.6335616 1.8972603 1.5547945 1.2482877 1.6558219
## [1492] 1.1267123 1.8493151 1.4931507 1.5034247 1.8561644 1.3458904 1.3698630
## [1499] 1.3441781 1.5684932 1.6643836 1.5890411 1.7962329 1.3732877 1.8664384
## [1506] 1.6027397 1.3595890 1.6678082 1.5684932 1.3065068 1.8116438 1.0462329
## [1513] 1.1883562 1.4589041 1.0051370 1.5770548 1.5753425 1.4195205 1.2534247
## [1520] 1.8047945 1.6232877 1.2465753 1.9554795 1.7893836 1.1472603 1.3989726
## [1527] 1.5239726 1.7243151 1.8047945 1.2089041 1.1404110 1.3304795 1.5034247
## [1534] 1.7534247 1.3715753 1.7089041 1.7876712 1.5256849 1.0616438 1.6575342
## [1541] 1.0890411 1.6609589 1.3030822 1.6746575 1.2003425 1.2705479 1.5308219
## [1548] 1.1438356 1.4366438 1.2345890 1.2979452 1.4023973 1.7910959 1.2054795
## [1555] 1.7157534 1.5702055 1.6541096 1.3613014 1.2054795 1.2448630 1.9537671
## [1562] 1.2003425 1.3886986 1.0496575 1.9109589 1.8167808 1.8202055 1.3356164
## [1569] 1.6969178 1.7791096 1.1746575 1.6250000 1.9263699 1.7654110 1.4743151
## [1576] 1.2808219 1.1729452 1.6660959 1.5890411 1.7174658 1.7140411 1.2140411
## [1583] 1.3544521 1.5325342 1.2568493 1.4400685 1.8082192 1.0650685 1.1934932
## [1590] 1.3082192 1.3852740 1.7157534 1.1934932 1.5941781 1.2448630 1.5821918
## [1597] 1.0702055 0.9315068 1.4400685 1.3989726 1.2037671 1.3270548 1.8904110
## [1604] 1.5239726 1.1832192 1.2619863 1.7688356 0.9982877 1.1215753 1.0496575
## [1611] 1.2020548 1.0976027 1.2294521 1.1318493 1.4794521 1.6917808 1.1558219
## [1618] 1.2551370 1.0770548 1.4092466 1.3253425 1.4828767 1.2739726 1.6883562
## [1625] 1.1232877 1.4212329 1.3476027 1.1815068 1.1455479 1.6232877 1.7260274
## [1632] 1.3681507 1.3030822 1.2551370 1.3767123 1.1609589 1.4537671 1.1900685
## [1639] 1.0380294 1.1961971 1.1789110 2.1037165 1.3189283 1.3535004 2.3517718
## [1646] 1.8150389 2.0959378 1.5721694 2.3223855 1.9887640 1.7493518 2.1002593
## [1653] 2.2247191 1.9956785 1.6058773 2.1313742 1.9732066 2.1184097 2.0838375
## [1660] 1.5937770 1.6923077 1.2731201 2.0570441 1.5012965 1.4062230 1.7631806
## [1667] 2.0682800 1.6179775 1.5133967 2.2048401 1.1884183 1.4511668 1.9006050
## [1674] 1.7182368 1.8245462 1.4079516 1.4036301 1.1685393 1.5401901 1.8262748
## [1681] 1.8599827 1.1849611 1.8228176 1.3448574 1.5808124 1.9654278 1.0829732
## [1688] 1.9792567 1.4554883 1.4719101 1.7502161 1.5479689 1.3941227 1.5237684
## [1695] 1.9956785 1.6897148 2.2601556 1.7528090 2.0397580 1.4926534 1.9049265
## [1702] 2.4304235 1.9930856 1.2264477 1.4053587 1.9049265 1.2048401 1.4312878
## [1709] 1.8504754 1.7960242 2.3941227 1.2610199 1.2186690 2.2532411 1.7182368
## [1716] 2.3898012 1.3258427 1.2394123 2.1970614 1.4088159 1.7856525 1.0242005
## [1723] 1.6464996 1.5764909 2.1080380 1.6542783 1.7433016 2.2705272 1.8513397
## [1730] 2.1944685 1.8893691 1.4511668 2.3318928 1.4987035 1.2713915 1.6378565
## [1737] 2.1970614 1.3120138 2.1045808 1.3828868 1.4589455 1.2696629 1.1037165
## [1744] 2.1201383 1.1771824 1.2350908 1.9066551 1.2696629 1.9818496 1.1728608
## [1751] 1.9334486 1.6560069 2.0069144 2.0440795 2.1780467 1.7579948 2.1460674
## [1758] 1.8357822 2.4554883 1.9913570 2.3846154 2.0613656 1.4667243 2.2203976
## [1765] 1.8262748 1.7925670 1.0803803 1.1547105 1.2186690 1.9057908 2.3699222
## [1772] 2.0622299 1.3716508 2.1408816 2.2428695 2.2791703 2.4200519 2.2480553
## [1779] 2.2713915 1.7709594 1.7346586 2.1261884 2.2065687 1.8349179 2.0596370
## [1786] 1.8885048 1.8988764 1.5583405 1.6335350 2.0069144 2.2186690 1.1210026
## [1793] 2.3984443 1.6343993 2.2169404 1.9913570 1.4874676 1.9161625 2.3318928
## [1800] 1.2731201 2.1296456 1.4805532 1.4044944 1.2541054 1.1754538 1.9455488
## [1807] 2.0751945 2.1676750 1.4122731 1.6101988 1.9239412 2.0743302 1.7216940
## [1814] 1.9714780 1.2765774 1.8833189 2.2368194 1.5496975 1.1659464 1.4814175
## [1821] 1.4615385 1.3595506 1.3275713 1.9515990 1.1659464 2.1140882 1.9671564
## [1828] 1.2549697 1.1771824 1.3457217 1.1037165 1.2143475 1.1521175 1.0388937
## [1835] 1.1253241 1.0475367 1.2195333 1.2566984 1.4036301 1.2195333 1.1426102
## [1842] 1.0544512 1.4399309 2.3379430 1.2108902 1.3284356 1.5367329 1.3163354
## [1849] 1.1080380 1.4909248 1.3405359 2.0181504 1.4874676 1.1771824 1.2169404
## [1856] 1.1607606 1.2532411 2.3241141 1.2558341 2.0103717 1.4226448 2.1391530
## [1863] 1.3785653 1.1840968 1.4848747 1.2826275 1.3336214 1.0458081 1.3068280
## [1870] 1.1815039 1.1573034 1.1633535 1.1866897 1.9075194 1.7934313 1.7770095
## [1877] 1.4373379 1.2722558 1.1832325 1.0959378 1.6655143 1.1452031 1.1979257
## [1884] 1.0319793 1.1884183 1.8193604 1.2463267 1.1581677 1.2055838 1.9035533
## [1891] 1.8375635 1.9784264 1.6827411 1.7055838 1.8515228 1.7119289 1.5241117
## [1898] 1.7474619 2.1535533 1.9454315 1.6230964 1.2880711 1.6954315 1.5659898
## [1905] 1.5012690 1.4010152 2.1522843 1.2347716 1.1294416 1.4923858 1.8083756
## [1912] 1.5812183 1.7626904 1.2436548 1.7804569 1.7246193 1.1256345 1.8096447
## [1919] 1.6535533 1.7804569 1.7956853 1.8388325 1.5380711 1.8007614 1.8185279
## [1926] 1.6446701 2.0215736 1.8489848 1.7918782 1.3413706 1.9492386 1.9784264
## [1933] 1.7550761 1.9327411 2.0507614 1.7880711 1.3502538 1.8147208 1.8642132
## [1940] 1.9200508 1.6916244 1.2487310 1.3565990 1.0228426 1.6218274 1.1560914
## [1947] 1.3058376 1.8439086 1.0964467 1.7119289 1.7944162 2.0989848 1.8362944
## [1954] 1.4403553 1.0114213 2.0545685 1.1624365 1.2208122 1.1675127 1.9847716
## [1961] 1.5038071 1.3705584 1.1548223 1.9225888 1.7461929 1.9784264 1.9073604
## [1968] 1.3045685 1.5964467 1.4860406 1.7081218 1.9428934 1.8299492 1.5672589
## [1975] 1.7347716 1.2728426 1.2017766 1.1294416 1.1167513 1.3058376 1.2043147
## [1982] 1.6078680 1.3172589 1.4200508 1.2791878 1.4860406 1.4733503 1.2994924
## [1989] 1.5152284 1.8451777 1.4416244 1.2829949 1.4238579 1.4758883 1.4022843
## [1996] 1.3159898 1.4403553 2.1027919 1.9238579 1.4733503 1.6116751 1.6713198
## [2003] 1.6992386 1.5647208 1.7449239 1.6459391 1.4631980 1.9720812 1.6395939
## [2010] 1.1510152 1.8401015 1.3071066 1.3147208 1.4111675 1.7550761 1.9060914
## [2017] 1.3032995 1.4251269 1.4010152 1.2855330 1.9124365 1.4162437 1.6763959
## [2024] 1.5723350 1.8845178 1.3629442 1.2639594 1.4949239 1.1522843 1.4644670
## [2031] 1.5812183 1.7208122 1.6332487 1.7563452 1.9530457 2.0393401 1.7182741
## [2038] 1.7220812 1.3972081 1.7804569 1.6015228 1.9873096 1.8515228 1.4187817
## [2045] 1.4974619 1.6218274 1.1040609 1.8223350 2.1586294 1.4530457 1.5964467
## [2052] 1.5266497 1.5621827 1.2563452 1.2017766 1.8972081 1.2487310 1.6269036
## [2059] 1.4289340 1.9124365 1.9073604 2.0812183 1.8426396 1.5406091 1.2715736
## [2066] 1.8413706 1.7233503 1.5786802 1.9124365 1.7500000 1.6776650 1.7690355
## [2073] 1.5177665 1.5063452 1.9073604 1.4454315 1.7233503 1.6725888 2.1395939
## [2080] 1.5761421 1.3781726 1.3223350 1.7144670 1.9352792 1.2360406 1.6205584
## [2087] 1.9657360 1.9200508 1.7525381 1.4467005 2.3324873 1.5520305 1.5913706
## [2094] 1.3401015 1.6281726 1.6624365 1.2373096 1.7093909 1.8781726 1.6269036
## [2101] 1.9784264 1.8921320 1.4961929 1.4784264 1.8058376 1.8413706 1.1256345
## [2108] 1.2487310 1.5177665 1.5152284 1.7703046 1.6967005 1.6725888 1.8667513
## [2115] 2.2461929 1.3375635 1.4098985 1.5228426 1.7525381 2.0126904 1.7106599
## [2122] 1.7500000 1.3324873 1.1700508 1.5342640 1.6205584 2.1865482 1.4758883
## [2129] 1.8223350 1.5875635 1.3769036 1.9606599 1.5266497 2.1548223 1.4137056
## [2136] 1.0507614 1.8565990 2.1040609 1.6751269 1.9505076 2.1116751 1.7284264
## [2143] 1.2626904 1.4035533 1.6662437 1.2246193 2.2906091 1.4670051 1.4352792
## [2150] 1.4314721 1.2652284 1.7639594 1.2170051 1.8769036 1.3375635 1.2918782
## [2157] 1.6192893 1.1218274 1.2664975 2.0545685 1.2715736 1.2461929 1.3921320
## [2164] 1.4796954 1.4365482 2.0964467 1.6497462 1.4416244 1.5647208 1.7703046
## [2171] 1.6065990 1.6802030 2.1802030 1.9809645 1.3286802 1.4149746 1.7868020
## [2178] 1.8426396 2.2030457 1.9530457 1.7220812 1.6878173 1.7588832 1.3071066
## [2185] 1.2576142 1.8261421 1.2182741 1.1763959 1.3908629 1.3426396 2.1319797
## [2192] 1.0291878 1.1586294 1.2741117 1.4619289 1.5368020 1.1142132 1.9048223
## [2199] 1.4111675 1.2753807 1.6154822 1.5850254 1.4911168 1.8020305 1.5291878
## [2206] 1.9263959 1.5888325 1.5431472 1.1357868 1.4543147 1.2588832 1.2043147
## [2213] 1.3071066 1.2753807 1.1890863 1.5697970 2.2779188 1.4086294 1.3680203
## [2220] 1.7614213 1.3223350 1.2829949 1.2652284 1.7043147 1.8654822 1.7043147
## [2227] 1.4746193 1.8540609 1.6015228 1.1827411 2.1027919 1.2131980 1.7855330
## [2234] 1.3730964 2.1928934 1.2449239 1.2436548 1.3794416 1.3908629 1.7715736
## [2241] 1.7677665 1.4010152 1.1903553 1.2931472 1.1916244 1.5279188 1.3261421
## [2248] 1.6637056 1.3578680 1.2170051 1.5558376 1.2664975 1.6078680 1.3604061
## [2255] 1.3274112 1.1827411 1.7309645 1.1916244 1.3921320 1.6967005 1.1228669
## [2262] 1.0025597 1.1663823 1.0904437 1.0750853 1.0315700 1.4172355 1.1817406
## [2269] 1.3575085 1.1877133 1.1049488 1.2935154 1.2090444 1.2167235 1.1911263
## [2276] 1.1689420 1.5443686 1.0418089 1.3558020 1.1689420 1.3139932 1.2474403
## [2283] 1.2883959 1.3165529 1.0119454 1.6083618 1.3583618 1.3788396 1.0887372
## [2290] 1.1783276 1.1834471 1.2704778 1.2593857 1.1689420 1.3011945 1.2064846
## [2297] 1.3250853 1.1049488 1.2116041 0.9650171 1.2235495 1.3566553 1.1134812
## [2304] 1.2090444 1.3626280 1.2525597 1.3558020 0.9752560 1.1783276 1.3310580
## [2311] 1.1527304 1.4539249 1.2133106 1.1493174 1.1348123 1.2619454 1.2730375
## [2318] 1.6049488 1.2124573 1.2559727 1.2346416 1.3438567 1.1228669 1.0383959
## [2325] 1.3447099 1.2824232 1.1424915 1.3976109 0.8993174 1.4325939 1.2627986
## [2332] 1.1715017 1.3353242 1.0861775 0.9820819 1.2500000 0.9334471 0.9325939
## [2339] 0.8617747 1.0238908 0.9880546 1.6322526 1.1706485 1.1134812 1.0435154
## [2346] 1.0904437 0.8899317 1.2192833 1.3447099 1.1953925 1.2431741 1.4206485
## [2353] 1.1373720 1.0435154 1.3395904 1.2013652 1.2440273 1.0614334 1.4078498
## [2360] 1.4146758 1.2389078 1.3063140 1.1911263 1.1168942 1.3174061 1.2261092
## [2367] 1.3336177 1.1075085 1.1407850 1.0853242 1.4343003 1.4308874 1.2414676
## [2374] 1.1894198 1.2704778 1.2397611 1.3404437 0.9769625 1.3626280 0.9769625
## [2381] 1.2209898 0.9291809 1.3737201 1.3114334 1.0119454 1.3592150 1.0819113
## [2388] 1.2926621 1.0366894 1.4351536 1.4633106 1.3174061 1.3344710 1.2866894
## [2395] 1.1433447 0.9803754 1.4351536 1.0341297 1.2047782 1.0375427 1.1493174
## [2402] 1.5912969 1.4104096 1.5358362 1.0972696 1.3634812 1.3856655 1.4300341
## [2409] 1.6271331 1.3558020 1.2098976 1.3703072 1.2397611 1.3242321 1.4837884
## [2416] 1.1049488 1.0392491 1.4633106 1.3600683 1.1885666 1.4334471 1.3054608
## [2423] 1.1902730 0.9206485 1.0571672 0.9317406 0.9155290 1.0656997 1.4112628
## [2430] 0.8890785 1.3839590 1.1493174 1.4436860 1.1160410 0.9675768 1.0110922
## [2437] 1.2627986 1.3651877 1.3796928 1.1433447 1.1808874 1.3063140 1.5793515
## [2444] 1.3907850 1.0622867 1.1211604 1.3540956 0.9291809 1.4453925 1.1587031
## [2451] 1.3856655 1.0708191 0.9061433 1.2047782 1.0571672 1.3583618 1.4692833
## [2458] 1.4317406 1.1518771 1.3114334 1.2542662 1.0964164 1.1373720 1.2994881
## [2465] 1.2457338 1.3447099 1.3600683 1.3805461 0.8720137 0.9556314 1.3003413
## [2472] 1.3165529 1.3293515 1.0554608 1.5315700 1.1544369 1.1919795 1.3472696
## [2479] 1.2226962 1.3370307 1.2184300 1.3438567 1.6459044 1.0332765 1.2073379
## [2486] 1.5042662 1.1134812 0.8976109 1.2192833 1.4803754 1.1774744 0.8344710
## [2493] 1.3430034 1.2619454 1.2269625 1.4001706 1.3447099 1.4308874 1.1484642
## [2500] 1.2653584 1.1390785 1.2627986 1.3276451 1.1791809 1.0742321 1.1186007
## [2507] 1.4530717 1.4436860 1.0392491 0.8626280 1.3634812 1.1186007 0.8745734
## [2514] 1.1305461 1.3907850 1.2909556 1.2337884 0.9163823 0.9249147 1.5836177
## [2521] 1.1501706 1.3677474 1.1561433 1.3643345 1.4121160 1.3037543 1.3464164
## [2528] 1.1578498 1.3907850 1.2500000 1.2261092 1.3941980 1.2423208 0.9411263
## [2535] 1.1561433 1.0383959 1.1279863 1.1723549 1.2047782 1.2039249 1.1245734
## [2542] 1.0008532 1.1092150 1.2602389 0.9087031 0.8139932 1.3566553 1.1100683
## [2549] 1.0281570 0.8984642 0.9803754 1.0998294 1.0051195 1.2252560 1.1322526
## [2556] 1.1279863 1.2474403 0.8796928 1.2397611 1.2013652 0.9462457 1.4283276
## [2563] 1.1655290 0.8890785 1.4470990 1.3668942 1.1646758 1.3071672 0.9035836
## [2570] 1.0145051 0.9351536 1.0511945 1.0947099 1.3959044 1.1672355 0.9880546
## [2577] 1.0674061 1.4215017 1.0733788 1.2849829 1.0793515 1.1023891 1.2440273
## [2584] 0.8822526 1.2030717 1.4163823 1.3259386 1.2653584 1.2687713 1.1945392
## [2591] 0.9991468 1.1416382 1.3199659 1.2679181 0.9488055 0.9633106 1.1296928
## [2598] 1.0972696 1.2687713 1.0204778 1.1075085 1.3472696 1.1058020 0.9377133
## [2605] 1.2039249 1.3412969 1.5759386 0.8523891 1.3788396 0.9505119 1.0324232
## [2612] 1.0358362 1.1365188 1.0025597 0.9232082 1.2474403 0.9931741 1.3250853
## [2619] 1.0409556 0.8472696 1.4667235 1.3353242 1.1817406 1.3668942 1.4257679
## [2626] 1.0349829 1.0750853 1.2064846 0.9360068 1.4607509 1.3165529 1.0742321
## [2633] 1.1126280 1.0870307 1.3515358 1.2320819 0.8933447 0.9377133 1.0204778
## [2640] 1.2414676 1.4624573 0.8430034 1.3677474 1.3063140 1.0469283 1.2286689
## [2647] 1.0776451 1.0674061 1.3122867 1.2354949 1.1783276 0.8899317 1.1450512
## [2654] 1.2926621 1.1757679 1.0051195 0.9829352 0.9385666 0.8660410 1.1168942
## [2661] 1.3370307 1.1296928 1.1271331 1.2807167 0.9129693 1.0093857 0.9573379
## [2668] 1.2841297 1.0341297 1.0358362 0.8438567 1.1569966 0.9607509 1.0452218
## [2675] 1.1015358 0.9104096 1.2380546 0.8873720 1.3302048 0.8370307 1.1075085
## [2682] 0.9257679 1.3421502 1.3122867 0.8395904 0.8941980 0.9820819 1.0998294
## [2689] 1.2209898 1.1894198 1.1203072 1.2423208 1.2312287 0.8848123 1.5247440
## [2696] 0.9718430 1.0921502 1.3754266 1.0887372 1.0631399 0.8617747 0.8165529
## [2703] 1.2943686 0.9769625 1.2167235 1.1040956 0.9488055 1.4877480 0.7479580
## [2710] 1.5390898 1.1995333 1.7106184 1.2077013 1.4492415 2.0116686 1.6557760
## [2717] 1.9801634 1.5682614 0.7339557 0.9953326 2.0525088 1.4725788 1.7969662
## [2724] 1.7304551 0.9311552 1.6242707 0.8798133 1.8448075 2.0851809 1.1306884
## [2731] 0.7619603 1.3570595 0.7969662 1.1411902 0.9579930 1.7176196 1.1913652
## [2738] 2.1796966 1.9241540 1.5705951 1.9218203 0.7467911 1.4539090 1.4842474
## [2745] 1.0326721 0.7899650 1.0665111 1.6814469 1.7642940 0.9323221 1.2007001
## [2752] 1.1493582 0.9591599 1.4387398 0.6569428 1.8879813 0.5565928 2.0991832
## [2759] 1.3173862 1.9871645 1.8856476 1.3255543 1.5787631 0.8004667 0.9008168
## [2766] 1.0513419 1.1691949 2.2567095 1.2007001 2.1271879 1.6534422 1.2567095
## [2773] 0.9731622 1.4492415 2.0536756 0.9498250 1.8004667 1.9626604 1.7386231
## [2780] 1.6791132 0.9754959 1.4072345 1.9276546 0.8599767 1.0746791 1.4865811
## [2787] 1.4912485 1.2975496 0.8646441 0.9521587 0.7934656 0.8926488 0.7024504
## [2794] 1.8763127 1.5927655 1.4807468 0.8844807 1.5122520 1.1400233 1.4784131
## [2801] 1.1330222 1.8401400 1.6989498 1.6219370 1.2345391 2.1365228 2.1143524
## [2808] 1.0816803 1.5624271 1.8494749 0.8413069 0.9101517 1.6149358 1.1575263
## [2815] 1.6931155 0.7094516 0.7572929 0.7234539 1.3290548 1.9731622 2.0711785
## [2822] 0.6569428 1.7409568 1.5857643 2.1341890 2.0898483 1.4317386 1.5659277
## [2829] 1.9754959 0.9533256 0.9428238 1.7082847 0.8074679 1.2508751 1.0093349
## [2836] 1.6149358 1.2147025 1.3908985 1.2030338 1.7409568 0.6172695 1.9988331
## [2843] 1.6511085 0.8074679 1.4562427 0.9848308 1.1376896 1.2683781 1.8191365
## [2850] 1.0455076 1.6079347 1.1645274 1.8611435 1.4410735 1.7479580 1.7537923
## [2857] 1.7036173 0.7561260 0.7199533 1.8996499 1.3290548 1.6347725 2.0746791
## [2864] 1.6394399 1.5682614 0.6429405 0.6767795 1.4889148 1.3978996 1.1971995
## [2871] 1.0758460 1.4084014 1.2438740 1.5134189 1.3698950 0.9136523 0.9393232
## [2878] 0.8401400 2.1400233 0.8996499 1.6639440 1.1178530 1.3932322 1.8168028
## [2885] 0.7642940 2.0396733 1.7806301 0.6569428 2.0420070 1.6826138 1.6359393
## [2892] 1.5565928 1.6534422 0.7491249 0.6032672 1.5449242 1.1575263 1.7701284
## [2899] 1.4387398 0.7129522 0.7199533 1.4200700 0.9813302 0.7339557 1.7234539
## [2906] 1.1015169 0.6499417 1.2532089 1.1715286 1.2018670 1.3302217 0.7024504
## [2913] 1.0653442 0.6219370 1.9369895 0.7794632 1.5472579 1.3220537 0.9673279
## [2920] 1.6126021 0.8856476 1.3430572 1.3943991 1.2508751 1.1633606 1.3628938
## [2927] 1.4620770 1.0571762 1.4714119 1.5087515 1.2473746 1.6674446 1.0140023
## [2934] 1.2870478 1.2263711 0.8016336 1.8273046 0.6464411 0.6697783 0.6861144
## [2941] 1.5425904 0.8623104 1.7771295 1.5880980 0.7467911 0.5635939 0.6837806
## [2948] 0.8238040 1.6301050 1.5075846 1.8156359 0.6441074 0.9241540 0.7584597
## [2955] 0.7129522 0.8121354 0.8588098 1.5122520 0.8681447 1.8331389 1.4434072
## [2962] 0.6662777 0.6406068 0.6686114 0.7491249 1.4130688 1.2917153 1.4737456
## [2969] 2.0641774 1.1236873 1.3745624 1.9533256 1.4177363 1.2357060 1.3745624
## [2976] 0.8109685 0.7421237 0.7001167 1.6126021 0.5939323 1.1575263 1.5915986
## [2983] 1.4597433 0.8354726 2.2940490 1.4049008 0.7421237 0.6301050 0.6207701
## [2990] 1.5204201 0.7164527 1.5297550 0.8494749 2.3278880 0.7292882 1.4445741
## [2997] 1.6172695 0.7047841 1.3570595 1.9148191 0.7024504 0.8319720 0.6732789
## [3004] 0.7736289 0.7327888 1.8214702 0.5869312 1.1540257 1.0163361 0.6791132
## [3011] 0.8098016 0.7596266 0.8273046 0.7712952 0.6359393 0.7654609 0.8634772
## [3018] 0.8389732 1.5694282 1.0326721 1.5554259 0.7409568 0.7701284 0.8576429
## [3025] 1.2917153 0.6896149 0.6406068 0.7689615 1.9019837 0.6464411 0.7024504
## [3032] 0.8436406 1.0641774 0.6814469 1.2030338 0.8203034 0.4889148 0.6756126
## [3039] 1.1866978 0.6289382 1.5414236 0.7187865 0.7514586 0.7899650 0.7199533
## [3046] 0.6931155 0.6441074 0.7141190 0.6569428 1.2406877 2.0315186 1.5229226
## [3053] 2.0014327 2.0143266 1.9054441 1.7865330 2.0100287 1.8581662 1.9871060
## [3060] 1.8424069 2.2191977 1.9484241 1.6704871 1.8037249 1.5988539 1.7177650
## [3067] 1.9025788 1.5759312 1.5573066 1.9369628 1.3982808 1.7578797 1.7736390
## [3074] 1.7349570 1.6633238 1.7048711 1.7707736 1.7392550 1.3796562 1.7521490
## [3081] 1.5042980 1.3065903 1.8180516 1.6418338 1.6690544 1.6604585 1.7936963
## [3088] 1.6088825 1.5329513 1.6948424 1.6318052 1.7492837 1.8810888 1.7034384
## [3095] 1.7650430 1.4928367 1.7421203 1.7206304 1.8997135 1.8481375 1.5257880
## [3102] 1.8567335 1.7521490 1.6518625 1.5902579 1.7679083 1.7722063 1.6045845
## [3109] 1.6962751 1.5157593 1.3323782 1.6117479 1.4971347 1.3853868 1.9255014
## [3116] 1.6876791 1.6446991 1.4627507 1.8037249 1.6088825 1.7363897 1.4742120
## [3123] 1.8438395 1.5702006 1.7535817 1.6891117 1.5071633 1.8681948 1.6848138
## [3130] 1.6031519 1.5343840 1.6446991 1.5630372 1.5257880 1.7893983 1.5587393
## [3137] 1.3295129 1.3853868 1.6504298 1.8710602 1.5644699 1.4885387 1.6891117
## [3144] 1.7091691 1.6547278 1.2851003 1.9512894 1.6418338 1.3911175 1.7979943
## [3151] 1.8280802 1.9255014 1.8266476 1.7234957 1.3825215 1.4656160 1.6676218
## [3158] 1.6575931 1.7191977 1.8839542 1.9455587 1.5257880 1.4684814 1.5487106
## [3165] 1.3839542 1.3638968 1.4598854 1.9957020 1.6232092 1.5530086 1.4283668
## [3172] 1.5673352 1.7449857 1.7707736 1.3954155 1.5415473 1.7106017 1.3338109
## [3179] 1.6475645 1.8467049 1.9183381 1.1891117 1.7550143 1.3667622 1.6862464
## [3186] 1.3108883 1.7005731 1.4555874 1.1876791 1.5472779 1.4871060 1.7048711
## [3193] 1.7220630 1.6833811 1.4828080 1.5057307 1.4942693 1.8123209 1.8724928
## [3200] 1.7063037 1.7607450 1.7134670 1.6962751 1.4770774 1.6762178 1.6805158
## [3207] 1.9355301 1.6303725 1.8008596 1.4154728 1.6547278 1.4785100 1.7163324
## [3214] 1.6575931 1.7120344 1.4770774 1.8123209 1.6934097 1.6733524 1.5744986
## [3221] 1.6575931 1.7421203 1.8911175 1.9541547 1.6074499 1.7134670 1.4684814
## [3228] 1.7621777 1.5444126 1.2535817 1.2736390 1.3495702 1.6432665 1.3094556
## [3235] 1.4670487 1.1432665 1.7593123 1.8467049 1.7363897 1.5902579 1.8853868
## [3242] 2.0157593 1.8237822 1.6977077 1.7263610 1.7722063 2.0830946 1.9340974
## [3249] 1.9312321 1.5974212 1.3638968 1.8510029 1.6060172 1.4426934 1.8008596
## [3256] 1.6604585 1.1618911 1.6045845 1.7965616 1.7808023 1.6891117 1.5630372
## [3263] 1.2034384 1.5100287 1.9727794 1.6045845 1.5659026 1.7020057 1.8696275
## [3270] 1.7822350 1.6475645 1.8753582 1.3151862 1.4369628 1.5673352 1.6991404
## [3277] 1.7249284 1.7234957 1.8223496 1.7664756 1.6661891 1.7277937 1.6647564
## [3284] 2.0286533 1.7564470 2.0042980 1.3753582 1.5358166 1.6905444 1.3595989
## [3291] 1.4169054 1.5171920 1.5587393 1.7263610 1.7893983 1.8696275 1.3137536
## [3298] 1.7693410 1.9240688 1.7378223 1.8051576 1.9312321 1.5257880 1.7048711
## [3305] 1.5802292 1.7650430 1.6762178 1.5988539 1.6146132 1.6361032 2.0272206
## [3312] 2.0974212 1.7707736 1.4828080 1.8424069 2.0143266 1.5558739 1.7220630
## [3319] 2.0573066 1.7077364 1.8710602 1.8295129 1.9813754 1.8939828 1.7005731
## [3326] 1.8710602 1.8409742 1.9011461 1.8553009 1.7263610 1.6332378 1.7593123
## [3333] 1.5372493 1.5372493 1.6848138 1.8452722 1.8123209 1.7951289 1.8237822
## [3340] 1.5057307 1.7578797 1.7779370 1.7406877 1.6762178 1.3724928 1.5888252
## [3347] 1.3008596 1.7134670 1.2707736 1.3896848 1.4570201 1.6303725 1.5429799
## [3354] 1.7822350 1.6045845 1.4742120 2.0429799 1.8739255 1.6232092 1.7636103
## [3361] 1.3237822 1.8896848 1.6504298 1.7177650 1.8123209 1.5329513 1.6189112
## [3368] 1.8037249 1.7263610 1.5415473 1.9140401 1.8653295 1.6876791 1.2722063
## [3375] 1.4570201 1.6446991 1.8037249 1.6489971 1.5515759 1.7550143 1.4727794
## [3382] 1.4942693 1.2306590 1.5386819 1.3266476 1.8352436 1.2722063 1.5888252
## [3389] 1.2220630 2.3102747 2.3774161 2.8453713 2.3814852 2.5493388 2.7080366
## [3396] 2.7110885 2.5829095 2.5818922 2.7049847 2.7833164 2.2899288 2.7355036
## [3403] 2.5523906 2.4221770 2.6408952 2.7497457 2.7822991 2.7171923 2.5228891
## [3410] 2.3357070 2.4109868 2.3611394 2.4618515 2.8504578 2.8463886 2.3296033
## [3417] 2.2909461 2.6215666 2.5127162 2.5045778 2.3652085 2.5981689 2.6388606
## [3424] 2.2858596 2.4801628 2.3296033 2.2848423 2.3692777 2.2838250 2.4476094
## [3431] 2.5279756 2.6510682 2.3855544 2.8514751 2.2675483 2.5289929 2.5361139
## [3438] 2.4435402 2.2594100 2.3357070 2.4069176 2.2939980 2.6500509 2.6826043
## [3445] 2.8494405 2.5635809 2.8026450 2.5045778 2.6144456 2.5310275 2.5249237
## [3452] 2.6795524 2.6083418 2.3947101 2.4099695 2.4608342 2.4537131 2.2512716
## [3459] 2.2807731 2.2777213 2.7263479 2.5849440 2.3967447 2.3733469 2.3631740
## [3466] 2.5320448 2.4140387 2.1515768 2.5645982 2.2960326 2.4374364 2.2166836
## [3473] 2.7507630 2.4099695 2.3631740 2.4811801 2.1973550 2.5818922 2.2441506
## [3480] 2.7802645 2.4486267 2.6327569 2.4872838 2.2746694 2.4435402 2.1922686
## [3487] 2.2950153 2.5452696 2.5981689 2.4669379 2.4262462 2.4272635 2.3234995
## [3494] 2.4883011 2.4008138 2.3265514 2.4791455 2.3733469 2.4404883 2.5442523
## [3501] 2.4272635 2.3540183 2.5025432 2.5350966 2.3967447 2.5249237 2.4435402
## [3508] 2.6144456 2.2807731 2.5991862 2.6632757 2.7965412 2.5869786 2.3112920
## [3515] 2.5096643 2.6531027 2.5208545 2.8179044 2.3652085 2.3702950 2.4120041
## [3522] 2.4435402 2.2044761 2.4160732 2.2685656 2.5554425 2.3296033 2.2726348
## [3529] 2.3326551 2.0752798 2.3245168 2.3621567 2.1515768 2.4557477 2.4577823
## [3536] 2.2960326 2.3845371 2.3387589 2.3855544 2.3845371 2.7263479 2.4730417
## [3543] 2.5574771 2.8524924 2.4089522 2.3468973 2.5981689 2.3072228 2.5116989
## [3550] 2.4984741 2.2980671 2.3479145 2.3763988 2.4649034 2.3987792 2.1861648
## [3557] 2.5747711 2.5289929 2.3285860 2.6317396 2.0325534 2.2970498 2.3245168
## [3564] 2.0356053 2.0956256 2.4791455 2.4008138 2.6164802 2.4760936 2.1953204
## [3571] 2.4791455 2.4262462 2.3672431 2.1424212 2.2309257 2.1190234 2.5798576
## [3578] 2.2889115 2.2258393 1.9918616 2.1851475 2.3987792 2.4598169 2.4211597
## [3585] 2.4160732 2.3601221 2.3743642 2.4018311 2.4659207 2.6541200 2.3763988
## [3592] 2.5239064 2.5025432 2.5076297 2.2034588 2.5737538 2.3519837 2.5350966
## [3599] 2.7548321 2.6368260 2.3336724 2.5717192 2.4150560 2.8118006 2.2858596
## [3606] 2.5605290 2.5483215 2.3357070 2.4048830 2.2156663 2.4252289 2.6917599
## [3613] 2.2258393 2.2929807 2.2014242 2.3336724 2.2889115 2.5066124 2.9206511
## [3620] 2.4608342 2.9715158 2.4211597 2.4618515 2.7131231 2.7802645 2.5930824
## [3627] 2.5981689 2.3631740 2.4079349 2.5768057 2.5015259 2.4598169 2.5096643
## [3634] 2.3519837 2.5381485 2.3194303 2.3234995 2.2695829 2.2126144 2.6113937
## [3641] 2.6775178 2.2695829 2.4689725 2.1637843 2.3509664 2.6093591 2.5350966
## [3648] 2.5178026 2.5493388 2.6327569 2.0579858 2.3184130 2.2919634 2.4211597
## [3655] 2.4099695 2.7426246 2.6195320 2.4883011 2.3652085 2.5656155 2.1719227
## [3662] 2.5096643 2.4038657 2.2950153 2.1892167 2.3468973 2.8026450 2.4872838
## [3669] 2.1220753 2.3997965 2.5442523 2.1495422 2.7853510 2.4811801 2.4476094
## [3676] 2.3540183 2.9562564 2.4455748 2.5239064 2.7822991 2.6602238 2.8351984
## [3683] 2.7293998 2.4170905 2.4933876 2.2482197 2.3062055 2.7141404 2.3123093
## [3690] 1.3915344 1.3425926 1.2552910 1.4272487 1.3425926 1.2129630 1.3253968
## [3697] 1.2989418 1.1481481 1.1084656 1.1931217 1.1428571 1.1812169 1.0634921
## [3704] 1.2619048 0.7830688 1.1587302 1.0992063 1.1455026 0.9444444 1.0925926
## [3711] 1.2301587 1.2116402 0.8769841 1.1534392 1.1375661 1.0502646 0.9947090
## [3718] 1.2500000 1.0608466 1.2261905 1.1891534 0.8425926 0.8399471 0.8730159
## [3725] 1.0753968 0.8505291 1.0211640 1.2314815 0.7619048 1.2738095 1.0740741
## [3732] 0.7976190 0.7301587 1.1137566 1.1970899 1.1812169 1.3822751 0.9113757
## [3739] 1.1388889 1.2579365 1.1164021 1.1560847 1.1388889 1.1706349 1.0740741
## [3746] 1.0661376 1.1137566 0.9087302 1.1005291 1.0727513 1.2129630 1.2037037
## [3753] 1.2566138 1.1878307 1.1574074 1.1507937 1.0621693 0.8571429 0.9722222
## [3760] 0.6693122 1.1759259 1.2566138 1.0079365 0.9669312 0.9312169 1.2275132
## [3767] 0.9047619 1.1984127 1.1891534 1.1507937 1.1216931 1.2592593 1.2116402
## [3774] 1.1944444 0.9629630 1.0515873 1.0820106 1.1865079 1.1203704 0.9748677
## [3781] 1.1521164 1.1018519 1.0687831 1.1600529 1.1124339 1.0753968 1.0105820
## [3788] 1.1388889 1.1521164 0.9642857 1.0886243 1.0396825 1.1970899 1.0119048
## [3795] 1.1269841 0.9867725 0.9246032 0.8214286 1.4047619 0.9603175 0.9100529
## [3802] 0.8240741 1.0780423 0.8029101 0.9285714 1.3029101 1.0317460 1.0939153
## [3809] 1.0132275 0.9219577 0.9682540 0.9166667 1.0277778 1.0753968 0.9894180
## [3816] 0.9338624 1.1772487 1.0039683 1.0634921 1.1957672 0.9960317 0.9246032
## [3823] 1.0687831 0.9682540 1.0648148 1.1296296 0.7619048 0.8690476 0.9060847
## [3830] 0.7433862 1.0727513 0.7711640 1.0727513 0.9179894 1.1388889 1.0992063
## [3837] 1.0489418 1.0978836 1.0158730 0.9708995 1.1031746 0.9775132 0.9060847
## [3844] 1.0013228 1.1335979 0.7473545 0.9140212 0.9642857 0.9775132 1.1785714
## [3851] 0.9259259 0.9404762 1.0846561 0.9007937 0.9510582 0.8955026 0.9365079
## [3858] 0.9828042 0.7632275 1.0026455 0.9642857 0.6944444 0.9126984 0.9682540
## [3865] 0.9404762 0.7711640 1.2367725 1.1825397 0.9444444 0.9325397 1.2658730
## [3872] 1.0145503 1.0105820 0.7843915 0.8240741 1.1230159 1.0542328 1.0621693
## [3879] 1.0330688 1.1084656 1.2235450 1.1216931 0.9404762 0.9920635 0.7619048
## [3886] 0.8822751 1.0595238 0.9378307 0.9484127 1.0978836 0.9497354 0.9894180
## [3893] 0.9563492 0.9523810 1.0026455 0.8584656 0.8809524 0.7976190 0.8452381
## [3900] 0.7103175 0.9933862 1.0304233 0.6666667 0.9113757 0.9603175 0.7486772
## [3907] 0.8002646 0.7103175 0.9232804 0.9563492 1.0952381 0.8862434 0.8492063
## [3914] 0.7791005 0.9206349 0.9510582 0.9656085 0.7632275 0.7182540 0.9722222
## [3921] 0.9537037 0.8280423 0.7896825 0.8029101 0.8174603 0.8888889 1.0370370
## [3928] 0.7976190 0.8822751 0.6785714 0.6997354 0.8055556 1.1613757 0.6851852
## [3935] 0.6970899 0.7962963 1.2764550 1.0436508 1.0317460 1.0740741 1.0423280
## [3942] 0.7910053 1.0132275 0.7764550 0.8955026 1.0264550 0.9788360 0.8941799
## [3949] 0.7354497 0.7089947 0.6997354 0.7605820 0.9246032 1.0052910 0.6772487
## [3956] 0.6838624 0.7103175 1.7308210 1.3930013 1.6379542 1.4427995 1.5181696
## [3963] 1.8842530 1.6729475 1.5154778 1.7698520 1.4535666 1.5114401 1.4387618
## [3970] 1.7012113 1.8829071 1.6393001 1.5666218 1.6218035 1.7281292 1.7025572
## [3977] 1.5706595 1.9946164 1.6837147 1.8021534 1.6541050 1.6864065 1.5989233
## [3984] 1.8721400 1.5464334 1.7833109 1.7711978 1.5558546 1.5316285 1.7496635
## [3991] 1.7052490 1.6796770 1.3633917 1.8075370 1.4602961 1.4427995 1.4724092
## [3998] 1.3889637 1.6675639 1.5652759 1.4535666 2.0080754 1.5087483 1.4858681
## [4005] 1.3243607 1.4279946 1.6527591 1.5531629 1.4239569 1.5154778 1.7846568
## [4012] 1.6083445 1.7429341 1.6298789 1.7092867 1.5787349 1.4966353 1.5881561
## [4019] 1.6500673 1.6164199 1.5302826 1.4131898 1.3378197 1.6393001 1.5841184
## [4026] 1.7563930 1.4858681 1.6864065 1.7846568 1.9407806 1.7927322 1.6944818
## [4033] 1.6393001 1.6514132 1.7981157 1.6137281 1.6621803 1.5935397 1.5087483
## [4040] 1.3149394 1.4118439 1.6164199 1.5598923 1.5087483 1.4602961 1.6433378
## [4047] 1.6204576 1.6527591 1.6796770 1.5450875 1.3701211 1.4212651 1.4764468
## [4054] 1.5531629 1.4266487 1.3189771 1.4764468 1.7052490 1.4010767 1.5195155
## [4061] 1.7577389 1.4952894 1.4764468 1.4037685 1.6877524 1.8183042 1.7375505
## [4068] 1.7092867 1.5504711 1.6150740 1.5397039 1.3364738 1.6137281 1.6783311
## [4075] 1.5679677 1.7617766 1.6164199 1.6419919 1.3553163 1.5141319 1.6931359
## [4082] 1.3862719 1.6890983 1.4508748 1.7913863 1.5114401 1.4131898 1.4091521
## [4089] 1.3660834 1.2045760 1.2570659 1.5302826 1.4818304 1.7792732 1.5827725
## [4096] 1.7039031 1.5989233 1.6110363 1.5612382 1.6029610 1.3566622 1.6366083
## [4103] 1.6312248 1.4199192 1.5760431 1.5060565 1.5975774 1.7039031 1.4522207
## [4110] 1.5545087 1.5598923 1.6069987 1.6312248 1.6433378 1.4858681 1.4549125
## [4117] 1.6083445 1.3351279 1.5787349 1.2624495 1.3189771 1.3526245 1.5558546
## [4124] 1.6500673 1.4993271 1.4347241 1.3432032 1.3916555 1.6406460 1.5773890
## [4131] 1.5881561 1.6567968 1.5074024 1.5356662 1.5114401 1.4212651 1.3553163
## [4138] 1.5652759 1.6137281 1.4239569 1.5047106 1.4495289 1.5639300 1.3216689
## [4145] 1.3458950 1.6231494 1.5329744 1.3310902 1.6285330 1.5181696 1.5720054
## [4152] 1.4468371 1.5666218 1.5114401 1.2113055 1.4239569 1.3418573 1.6864065
## [4159] 1.3445491 1.3593540 1.5518170 1.5652759 1.6379542 1.4683715 1.3580081
## [4166] 1.3566622 1.2947510 1.5868102 1.3337820 1.6271871 1.4710633 1.5477793
## [4173] 1.6446837 1.5154778 1.3512786 1.6312248 1.6029610 1.4441454 1.7012113
## [4180] 1.5787349 1.5908479 1.8358008 1.5962315 1.4226110 1.5383580 1.7187079
## [4187] 1.6123822 1.5854643 1.4952894 1.3916555 1.6460296 1.4185734 1.6164199
## [4194] 1.5545087 1.6904441 1.5491252 1.6514132 1.5531629 1.4724092 1.4912517
## [4201] 1.4683715 1.4791386 1.4279946 1.2960969 1.2920592 1.4253028 1.3526245
## [4208] 1.3324361 1.5423957 1.7483176 1.7981157 1.6567968 1.4374159 1.5814266
## [4215] 1.5020188 1.5881561 1.4629879 1.2395693 1.2274563 1.6191117 1.4401077
## [4222] 1.3647376 1.5154778 1.7294751 1.3660834 1.3324361 1.2207268 1.2449529
## [4229] 1.5127860 1.6648721 1.5208614 1.5639300 1.4899058 1.2947510 1.2987887
## [4236] 1.4912517 1.4993271 1.2947510 1.2382234 1.5289367 1.3055182 1.2624495
## [4243] 1.3458950 1.6069987 1.4589502 1.5531629 1.4683715 1.2745626 1.4051144
## [4250] 1.6110363 1.3082100 1.5881561 1.4145357 1.3432032 1.2826380 1.5477793
## [4257] 1.3512786 1.4131898 1.2247645 1.2341857 1.4037685 1.4993271 1.2449529
## [4264] 1.4468371 1.4522207 1.3364738 1.2126514 1.3109017 1.1440108 1.2516824
## [4271] 1.2368775 1.2247645 1.3176312 1.1359354 1.3041723 1.1399731 1.2570659
## [4278] 1.2987887 1.2072678 1.3230148 1.3432032 1.1938089 1.2193809 1.3391655
## [4285] 1.4791386 1.2530283 1.1803499 1.1480485 1.2664872 1.2368775 1.2234186
## [4292] 1.2920592 1.1991925 1.2624495 1.1534320 1.0632571 1.2086137 1.2301480
## [4299] 1.2328398 1.1224764 1.2072678 1.1359354 1.2314939 1.2664872 1.1857335
## [4306] 1.4253028 1.2570659 1.4899058 1.4374159 1.2234186 1.3068641 1.2920592
## [4313] 1.2853297 1.2934051 1.3741588 1.4091521 1.1144011 1.2664872 1.4333782
## [4320] 1.1938089 1.1480485 1.2274563 1.1628533 1.2987887 1.1924630 1.2247645
## [4327] 1.4266487 1.3876178 1.1790040 1.2059219 1.3243607 1.2651413 1.2288022
## [4334] 1.3216689 1.2382234 1.1170929 1.2516824 1.3364738 1.2772544 1.2059219
## [4341] 1.2032301 1.2207268 1.1534320 1.2624495 1.2409152 1.1520861 1.0323015
## [4348] 1.3903096 1.1776581 1.1601615 1.1426649 1.2557201 1.0511440 1.0794078
## [4355] 1.2678331 1.3620458 1.2449529 1.2476447 1.1924630 1.2274563 1.3391655
## [4362] 1.1223629 0.8691983 0.9535865 0.9071730 0.9725738 1.0506329 1.1244726
## [4369] 1.1814346 1.1687764 0.7932489 0.6350211 1.0168776 0.9430380 0.9282700
## [4376] 1.0253165 0.9156118 0.8797468 0.9008439 0.8670886 0.9556962 0.9303797
## [4383] 1.3016878 1.0864979 1.2552743 1.3691983 1.2784810 1.4746835 0.9873418
## [4390] 1.1350211 0.9831224 1.1075949 1.1666667 1.0253165 0.9810127 1.0042194
## [4397] 1.1772152 1.0274262 0.8143460 1.0021097 0.6877637 1.0421941 0.8966245
## [4404] 0.9746835 1.4873418 1.0021097 0.8206751 0.9198312 0.8818565 0.6666667
## [4411] 1.5042194 0.8438819 1.4831224 1.0084388 1.0864979 0.9261603 1.2067511
## [4418] 0.7088608 0.7088608 1.1877637 1.1244726 1.0464135 1.0886076 0.9556962
## [4425] 1.3417722 0.9535865 1.1983122 1.0843882 0.8438819 0.8776371 0.9556962
## [4432] 1.1835443 1.0717300 0.9556962 1.1139241 1.0548523 1.0168776 0.9873418
## [4439] 1.2953586 1.3375527 1.1054852 0.9725738 1.0210970 1.1645570 1.1392405
## [4446] 1.4409283 1.0358650 1.0316456 0.9767932 0.7805907 0.9725738 1.3839662
## [4453] 0.9345992 1.1223629 0.8459916 1.0400844 0.8607595 0.9978903 1.0949367
## [4460] 1.0000000 1.1223629 1.0717300 1.3396624 1.1286920 1.3586498 1.0801688
## [4467] 1.2890295 1.2911392 1.1940928 1.0358650 1.1286920 1.2088608 0.8565401
## [4474] 1.1160338 1.0274262 0.9388186 1.2784810 1.4219409 1.0717300 1.0210970
## [4481] 1.2510549 1.0506329 0.9324895 1.3734177 1.6054852 1.2869198 1.4451477
## [4488] 1.4050633 1.1898734 1.0886076 1.0337553 1.4578059 1.1708861 1.2215190
## [4495] 0.9831224 1.0000000 0.7869198 0.9704641 1.0168776 1.0464135 1.2805907
## [4502] 1.5316456 1.3396624 1.3797468 1.3734177 1.2784810 1.0337553 1.0316456
## [4509] 0.8607595 1.4029536 1.3966245 1.3691983 1.1751055 1.0253165 0.6392405
## [4516] 0.8312236 1.1286920 0.7004219 0.5485232 1.0464135 0.9514768 1.0464135
## [4523] 1.1413502 0.7172996 0.9852321 0.9029536 0.8713080 1.0864979 0.9240506
## [4530] 1.0907173 1.1561181 1.1666667 0.8481013 0.9261603 1.0443038 1.2953586
## [4537] 0.9662447 0.7362869 0.8565401 0.9008439 1.0168776 1.2130802 1.0126582
## [4544] 1.0632911 1.3987342 0.9451477 0.8206751 0.8459916 1.2320675 0.9282700
## [4551] 0.8502110 0.8565401 1.1561181 1.1181435 0.7616034 1.1181435 0.7362869
## [4558] 1.2510549 1.2426160 0.9894515 1.1097046 0.9599156 1.0632911 1.4873418
## [4565] 1.4219409 1.3628692 1.2362869 0.9978903 1.1350211 1.4135021 1.3270042
## [4572] 1.1181435 1.3839662 1.3966245 1.0232068 1.1118143 1.1497890 1.0295359
## [4579] 1.2341772 1.2995781 0.6835443 0.9409283 0.9071730 1.0485232 0.8924051
## [4586] 1.2805907 0.8354430 1.2594937 0.7046414 0.8459916 1.4810127 0.6919831
## [4593] 0.7552743 1.1645570 0.8734177 1.0527426 1.3670886 0.9282700 0.9810127
## [4600] 0.9240506 0.7130802 0.9029536 0.7637131 1.0443038 0.9451477 0.8291139
## [4607] 0.9071730 0.8164557 0.9746835 0.8586498 0.9092827 0.8059072 0.9493671
## [4614] 0.9873418 0.9198312 1.2637131 0.8902954 0.9493671 0.9683544 1.0400844
## [4621] 0.9324895 1.0632911 0.7911392 1.1877637 0.8839662 0.9367089 1.0210970
## [4628] 0.7447257 1.1561181 1.1202532 0.7974684 0.9282700 0.7974684 1.1983122
## [4635] 0.8375527 0.9388186 0.8628692 1.1856540 0.7869198 1.1350211 0.8670886
## [4642] 1.2336283 1.1079646 1.3238938 1.0070796 1.1699115 1.4000000 1.3681416
## [4649] 1.5575221 1.4920354 1.3132743 1.4123894 1.2867257 1.4230088 1.4371681
## [4656] 1.4265487 1.1309735 1.3026549 1.2867257 1.2495575 1.4123894 1.3026549
## [4663] 1.2707965 1.1610619 1.2212389 1.1840708 1.3150442 1.3274336 1.3699115
## [4670] 1.1274336 1.3592920 1.3787611 1.2690265 1.1362832 1.4230088 1.3132743
## [4677] 1.2867257 1.1734513 1.0637168 1.1150442 0.9681416 1.0389381 0.8955752
## [4684] 1.0690265 1.2212389 1.2814159 1.2194690 1.2884956 1.4141593 1.2991150
## [4691] 1.2566372 1.0495575 1.3079646 1.2955752 1.4230088 1.2460177 0.9752212
## [4698] 1.3893805 1.3044248 1.4831858 1.1610619 1.4088496 1.2353982 1.4460177
## [4705] 1.3663717 1.2159292 1.3646018 1.3309735 1.2371681 1.3681416 1.1203540
## [4712] 1.1008850 1.4849558 1.4070796 1.0973451 1.4690265 1.4318584 1.2353982
## [4719] 1.4106195 0.8247788 0.9787611 1.1964602 1.4601770 1.2938053 1.2477876
## [4726] 1.1628319 1.2743363 1.3079646 0.9929204 1.0938053 1.2000000 1.4035398
## [4733] 1.4938053 1.2849558 1.4176991 1.3946903 1.2088496 1.5734513 1.0867257
## [4740] 1.0761062 1.0424779 1.3699115 1.2194690 1.0389381 1.1752212 1.2867257
## [4747] 1.1115044 1.1699115 0.9805310 1.2371681 1.2477876 1.1150442 1.2088496
## [4754] 1.1575221 0.9097345 1.0353982 1.0477876 0.8601770 1.2318584 1.3752212
## [4761] 1.1592920 1.3734513 1.0194690 1.0212389 0.8938053 1.1044248 0.9185841
## [4768] 1.0460177 1.0938053 1.0212389 1.0389381 0.9256637 1.1008850 1.0761062
## [4775] 1.1185841 1.0513274 1.3805310 1.0814159 1.2796460 1.3061947 0.9575221
## [4782] 1.1079646 1.1699115 1.1256637 1.0017699 1.2336283 1.3274336 0.9929204
## [4789] 0.9699115 1.1716814 0.9805310 1.2495575 0.9026549 1.1469027 0.9469027
## [4796] 1.2247788 1.0902655 1.3292035 0.7646018 1.1539823 0.9646018 0.7893805
## [4803] 0.9274336 0.6849558 1.2194690 0.9433628 1.1893805 1.2654867 0.8176991
## [4810] 1.0707965 1.1079646 0.8973451 0.7840708 0.9876106 1.4000000 0.8053097
## [4817] 1.3238938 1.2690265 1.0017699 0.9769912 1.0336283 1.0230088 1.3398230
## [4824] 1.2088496 1.4141593 1.3292035 1.3044248 0.9398230 0.9185841 1.1238938
## [4831] 0.9486726 1.0176991 1.0106195 1.1221239 1.4725664 1.0920354 1.0884956
## [4838] 1.3150442 1.0831858 0.8707965 1.3079646 1.0831858 0.9309735 0.7451327
## [4845] 0.8778761 1.1787611 1.0530973 0.7504425 0.9610619 0.9752212 0.8353982
## [4852] 0.8548673 0.9787611 1.0389381 0.8796460 1.0796460 1.2053097 1.3557522
## [4859] 1.3238938 0.9345133 1.2530973 0.9203540 1.0141593 1.4230088 0.9858407
## [4866] 1.1981651 1.3165138 1.1477064 1.0330275 1.1504587 1.0917431 1.2073394
## [4873] 1.1192661 1.4064220 1.2981651 1.1541284 1.5284404 1.2027523 1.3688073
## [4880] 1.2394495 1.0871560 1.1431193 0.9752294 1.1119266 1.2091743 0.9770642
## [4887] 1.0055046 1.1532110 1.2073394 1.3541284 1.1247706 1.3834862 1.2779817
## [4894] 1.3128440 1.2266055 1.2302752 1.0376147 1.0229358 1.2467890 1.3357798
## [4901] 1.1944954 1.2321101 1.2605505 1.0733945 1.0577982 1.3412844 1.1146789
## [4908] 1.2972477 1.1541284 1.1844037 1.2798165 1.2706422 0.9853211 1.1467890
## [4915] 1.1779817 1.0165138 1.1422018 1.1697248 1.3073394 1.2440367 1.2165138
## [4922] 1.0880734 0.9486239 1.1853211 1.0458716 0.8908257 1.1917431 0.9990826
## [4929] 1.0211009 0.7990826 1.0330275 1.2321101 1.3623853 1.1412844 1.0256881
## [4936] 1.1697248 1.2174312 1.0357798 1.1183486 1.2550459 1.0880734 1.3284404
## [4943] 1.3513761 1.0137615 1.0311927 1.2091743 1.2036697 1.0339450 1.1550459
## [4950] 1.2559633 1.0247706 0.7963303 0.9853211 1.0577982 1.2055046 1.0770642
## [4957] 1.1559633 1.0422018 0.7587156 1.0669725 1.3247706 1.1623853 1.1981651
## [4964] 1.1055046 1.0348624 1.1422018 1.1853211 0.9027523 1.3128440 1.1247706
## [4971] 1.0899083 1.1339450 1.2137615 1.0073394 1.3220183 1.2339450 0.8908257
## [4978] 0.9009174 0.7889908 0.9697248 1.0697248 1.1642202 1.2990826 0.9917431
## [4985] 1.0376147 1.1935780 1.0018349 0.9348624 1.0183486 1.1477064 0.9036697
## [4992] 1.2376147 1.0211009 1.2678899 1.3119266 1.2385321 1.1045872 1.0091743
## [4999] 1.0339450 1.1311927 1.2642202 1.0165138 0.8495413 1.1220183 1.0183486
## [5006] 1.3623853 1.1403670 1.0027523 1.0183486 0.9715596 1.1596330 0.9889908
## [5013] 1.0412844 1.0440367 1.1871560 1.0844037 1.0559633 1.0220183 1.2366972
## [5020] 1.0981651 1.1724771 1.2394495 1.1541284 1.2596330 1.2155963 0.9082569
## [5027] 1.1935780 1.0568807 1.1128440 1.2036697 1.0889908 0.8422018 1.1357798
## [5034] 0.8633028 1.0660550 0.9908257 0.9183486 1.1000000 0.9110092 1.2422018
## [5041] 1.0458716 0.9981651 1.1000000 1.0055046 0.9495413 0.9027523 1.0009174
## [5048] 0.9834862 1.1119266 1.1871560 0.9761468 1.0412844 1.2559633 0.9816514
## [5055] 1.1963303 1.1807339 0.8486239 0.9577982 1.1018349 0.6871560 0.8633028
## [5062] 1.0137615 1.1816514 1.1321101 0.8724771 0.8266055 0.7477064 1.3201835
## [5069] 1.2192661 0.8798165 1.0770642 0.9412844 0.8284404 0.7568807 0.7697248
## [5076] 0.8220183 0.7036697 1.0788991 0.7238532 0.6935780 0.9889908
#differentiating these avg spike counts per session
aveg_spikes <- function(session_index) {
    avg_spikes <- numeric(0)
    this_session <- session[[session_index]]
    
    for (j in 1:length(this_session$spks)) {
        spks_trial <- this_session$spks[[j]]
        total_spikes <- apply(spks_trial, 1, sum)
        avg_spikes <- c(avg_spikes, mean(total_spikes))
    }
    
    return(avg_spikes)
}

aveg_spikes_s1 <- aveg_spikes(1)


#Density plot of avg spike counts
plot(density(avg_spikes), xlab = "Average Spike Counts", ylab = "Density", main = "Density Plot of Average Spike Counts")

#Histogram
hist(avg_spikes, breaks = 20, xlab = "Average Spike Counts", ylab = "Frequency", main = "Distribution of Average Spike Counts")

#Line plot
plot(avg_spikes, type = "l", xlab = "Trials", ylab = "Average Spike Counts", main = "Average Spike Counts")

#performing clustering with three clusters
k <- 3  
set.seed(123) 
clusters <- kmeans(avg_spikes, centers = k)


cluster_labels <- clusters$cluster
clusters
## K-means clustering with 3 clusters of sizes 2013, 2223, 845
## 
## Cluster means:
##        [,1]
## 1 0.9262672
## 2 1.4459524
## 3 2.2303109
## 
## Clustering vector:
##    [1] 2 2 3 2 2 1 3 2 2 2 2 3 2 2 2 2 2 3 3 3 2 2 2 2 2 3 3 3 1 3 2 2 2 2 3 3 2
##   [38] 2 1 2 2 2 2 3 2 3 2 2 2 2 3 3 2 2 2 3 2 2 2 3 3 2 2 2 2 2 2 2 2 2 2 1 2 2
##   [75] 2 2 2 2 2 1 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2
##  [112] 1 2 2 2 1 2 2 2 1 2 2 1 1 2 2 2 2 2 2 2 2 2 2 2 2 2 2 1 2 1 2 2 2 2 2 2 2
##  [149] 1 2 2 2 1 2 2 2 2 2 2 2 2 2 1 2 2 2 2 2 2 2 2 2 2 1 2 2 1 2 2 2 2 2 2 1 2
##  [186] 1 2 1 2 1 2 1 2 2 2 2 2 2 2 2 2 2 1 1 1 1 1 1 2 1 1 1 1 2 2 1 2 1 2 2 2 1
##  [223] 1 2 2 2 1 2 2 2 2 2 1 2 2 1 2 1 1 1 2 1 2 2 2 1 1 2 1 1 1 2 1 1 2 1 2 2 1
##  [260] 2 2 1 1 1 1 2 1 2 2 2 2 1 2 2 2 1 2 2 2 2 2 2 1 1 2 1 2 2 1 2 2 2 1 2 2 2
##  [297] 1 2 1 2 1 2 2 1 2 1 1 1 2 1 1 2 1 2 2 2 2 2 2 2 1 2 2 1 2 2 1 2 2 2 1 2 2
##  [334] 1 1 2 2 1 2 1 1 1 2 2 1 1 2 2 2 2 2 2 2 2 2 2 1 1 1 2 1 2 1 2 2 3 3 3 3 3
##  [371] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 2 3 3 3 3 3 3 3 3 3 3 3
##  [408] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
##  [445] 3 3 3 3 3 3 3 3 3 3 3 3 3 2 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 2 3 2
##  [482] 3 3 3 3 3 3 3 3 3 2 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 2 3 3 3 3 3 3 3 3 3 3
##  [519] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 2 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
##  [556] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 2 3 3 3 3 3 2 3 3 3 3 3 3
##  [593] 3 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
##  [630] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
##  [667] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
##  [704] 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
##  [741] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
##  [778] 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
##  [815] 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 2 1 2 1 2 2 2 2 2
##  [852] 2 1 2 1 2 2 2 2 2 1 2 2 2 2 1 2 2 2 2 1 1 1 2 2 2 2 1 2 2 2 1 2 1 1 2 2 1
##  [889] 2 1 1 2 1 1 1 1 1 1 1 2 2 1 1 1 2 1 2 2 2 2 1 1 1 1 2 1 2 1 1 2 1 2 2 2 1
##  [926] 2 1 1 2 2 2 2 1 2 1 1 2 1 2 2 2 1 2 1 1 1 2 2 1 1 1 2 2 2 2 2 1 2 2 2 2 2
##  [963] 1 1 2 2 1 1 1 2 2 2 1 2 1 1 1 1 1 1 2 2 2 2 1 1 1 1 1 2 1 1 2 2 2 2 1 1 1
## [1000] 2 1 1 1 1 2 1 2 2 1 2 1 1 2 1 1 2 1 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 2 1 2
## [1037] 1 2 2 1 1 1 1 1 1 1 2 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1074] 1 1 2 1 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1111] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1148] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1185] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1222] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1259] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1296] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1333] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1370] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 1 2 2 1 1 1 2 2 2 2 1 2 3
## [1407] 1 2 2 1 1 1 1 2 2 2 2 2 1 2 3 2 1 1 2 1 3 2 2 2 2 1 2 2 2 1 1 1 3 3 1 2 2
## [1444] 3 1 1 2 2 2 2 1 2 2 2 3 2 1 1 2 2 3 2 2 2 1 2 2 2 1 3 2 1 2 2 2 2 1 2 3 3
## [1481] 1 3 3 1 2 3 2 3 2 2 2 1 3 2 2 3 2 2 2 2 2 2 2 2 3 2 2 2 2 2 2 1 2 2 1 2 2
## [1518] 2 2 2 2 2 3 2 1 2 2 2 2 2 1 2 2 2 2 2 2 2 1 2 1 2 2 2 2 2 2 1 2 2 2 2 2 2
## [1555] 2 2 2 2 2 2 3 2 2 1 3 2 2 2 2 2 1 2 3 2 2 2 1 2 2 2 2 2 2 2 2 2 2 1 2 2 2
## [1592] 2 2 2 2 2 1 1 2 2 2 2 3 2 1 2 2 1 1 1 2 1 2 1 2 2 1 2 1 2 2 2 2 2 1 2 2 1
## [1629] 1 2 2 2 2 2 2 1 2 2 1 2 1 3 2 2 3 2 3 2 3 3 2 3 3 3 2 3 3 3 3 2 2 2 3 2 2
## [1666] 2 3 2 2 3 2 2 3 2 2 2 2 1 2 2 3 1 2 2 2 3 1 3 2 2 2 2 2 2 3 2 3 2 3 2 3 3
## [1703] 3 2 2 3 2 2 3 2 3 2 2 3 2 3 2 2 3 2 2 1 2 2 3 2 2 3 3 3 3 2 3 2 2 2 3 2 3
## [1740] 2 2 2 1 3 1 2 3 2 3 1 3 2 3 3 3 2 3 2 3 3 3 3 2 3 2 2 1 1 2 3 3 3 2 3 3 3
## [1777] 3 3 3 2 2 3 3 2 3 3 3 2 2 3 3 1 3 2 3 3 2 3 3 2 3 2 2 2 1 3 3 3 2 2 3 3 2
## [1814] 3 2 3 3 2 1 2 2 2 2 3 1 3 3 2 1 2 1 2 1 1 1 1 2 2 2 2 1 1 2 3 2 2 2 2 1 2
## [1851] 2 3 2 1 2 1 2 3 2 3 2 3 2 1 2 2 2 1 2 1 1 1 2 3 2 2 2 2 1 1 2 1 2 1 2 2 2
## [1888] 1 2 3 2 3 2 2 3 2 2 2 3 3 2 2 2 2 2 2 3 2 1 2 2 2 2 2 2 2 1 2 2 2 2 3 2 2
## [1925] 2 2 3 3 2 2 3 3 2 3 3 2 2 2 3 3 2 2 2 1 2 1 2 3 1 2 2 3 2 2 1 3 1 2 1 3 2
## [1962] 2 1 3 2 3 3 2 2 2 2 3 2 2 2 2 2 1 1 2 2 2 2 2 2 2 2 2 2 3 2 2 2 2 2 2 2 3
## [1999] 3 2 2 2 2 2 2 2 2 3 2 1 3 2 2 2 2 3 2 2 2 2 3 2 2 2 3 2 2 2 1 2 2 2 2 2 3
## [2036] 3 2 2 2 2 2 3 3 2 2 2 1 2 3 2 2 2 2 2 2 3 2 2 2 3 3 3 3 2 2 3 2 2 3 2 2 2
## [2073] 2 2 3 2 2 2 3 2 2 2 2 3 2 2 3 3 2 2 3 2 2 2 2 2 2 2 3 2 3 3 2 2 2 3 1 2 2
## [2110] 2 2 2 2 3 3 2 2 2 2 3 2 2 2 1 2 2 3 2 2 2 2 3 2 3 2 1 3 3 2 3 3 2 2 2 2 2
## [2147] 3 2 2 2 2 2 2 3 2 2 2 1 2 3 2 2 2 2 2 3 2 2 2 2 2 2 3 3 2 2 2 3 3 3 2 2 2
## [2184] 2 2 2 2 1 2 2 3 1 1 2 2 2 1 3 2 2 2 2 2 2 2 3 2 2 1 2 2 2 2 2 2 2 3 2 2 2
## [2221] 2 2 2 2 3 2 2 3 2 1 3 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 1 2
## [2258] 2 2 2 1 1 1 1 1 1 2 1 2 2 1 2 2 2 2 1 2 1 2 1 2 2 2 2 1 2 2 2 1 1 1 2 2 1
## [2295] 2 2 2 1 2 1 2 2 1 2 2 2 2 1 1 2 1 2 2 1 1 2 2 2 2 2 2 2 1 1 2 2 1 2 1 2 2
## [2332] 1 2 1 1 2 1 1 1 1 1 2 1 1 1 1 1 2 2 2 2 2 1 1 2 2 2 1 2 2 2 2 2 1 2 2 2 1
## [2369] 1 1 2 2 2 2 2 2 2 1 2 1 2 1 2 2 1 2 1 2 1 2 2 2 2 2 1 1 2 1 2 1 1 2 2 2 1
## [2406] 2 2 2 2 2 2 2 2 2 2 1 1 2 2 2 2 2 2 1 1 1 1 1 2 1 2 1 2 1 1 1 2 2 2 1 1 2
## [2443] 2 2 1 1 2 1 2 1 2 1 1 2 1 2 2 2 1 2 2 1 1 2 2 2 2 2 1 1 2 2 2 1 2 1 2 2 2
## [2480] 2 2 2 2 1 2 2 1 1 2 2 1 1 2 2 2 2 2 2 1 2 1 2 2 1 1 1 2 2 1 1 2 1 1 1 2 2
## [2517] 2 1 1 2 1 2 1 2 2 2 2 1 2 2 2 2 2 1 1 1 1 1 2 2 1 1 1 2 1 1 2 1 1 1 1 1 1
## [2554] 2 1 1 2 1 2 2 1 2 1 1 2 2 1 2 1 1 1 1 1 2 1 1 1 2 1 2 1 1 2 1 2 2 2 2 2 2
## [2591] 1 1 2 2 1 1 1 1 2 1 1 2 1 1 2 2 2 1 2 1 1 1 1 1 1 2 1 2 1 1 2 2 1 2 2 1 1
## [2628] 2 1 2 2 1 1 1 2 2 1 1 1 2 2 1 2 2 1 2 1 1 2 2 1 1 1 2 1 1 1 1 1 1 2 1 1 2
## [2665] 1 1 1 2 1 1 1 1 1 1 1 1 2 1 2 1 1 1 2 2 1 1 1 1 2 2 1 2 2 1 2 1 1 2 1 1 1
## [2702] 1 2 1 2 1 1 2 1 2 2 2 2 2 3 2 3 2 1 1 3 2 2 2 1 2 1 3 3 1 1 2 1 1 1 2 2 3
## [2739] 3 2 3 1 2 2 1 1 1 2 2 1 2 1 1 2 1 3 1 3 2 3 3 2 2 1 1 1 1 3 2 3 2 2 1 2 3
## [2776] 1 2 3 2 2 1 2 3 1 1 2 2 2 1 1 1 1 1 3 2 2 1 2 1 2 1 3 2 2 2 3 3 1 2 3 1 1
## [2813] 2 1 2 1 1 1 2 3 3 1 2 2 3 3 2 2 3 1 1 2 1 2 1 2 2 2 2 2 1 3 2 1 2 1 1 2 2
## [2850] 1 2 1 3 2 2 2 2 1 1 3 2 2 3 2 2 1 1 2 2 2 1 2 2 2 2 1 1 1 3 1 2 1 2 2 1 3
## [2887] 2 1 3 2 2 2 2 1 1 2 1 2 2 1 1 2 1 1 2 1 1 2 1 2 2 1 1 1 3 1 2 2 1 2 1 2 2
## [2924] 2 1 2 2 1 2 2 2 2 1 2 2 1 2 1 1 1 2 1 2 2 1 1 1 1 2 2 2 1 1 1 1 1 1 2 1 2
## [2961] 2 1 1 1 1 2 2 2 3 1 2 3 2 2 2 1 1 1 2 1 1 2 2 1 3 2 1 1 1 2 1 2 1 3 1 2 2
## [2998] 1 2 3 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 2 1 2 1 1 1 2 1 1 1 3 1 1 1 1 1
## [3035] 2 1 1 1 2 1 2 1 1 1 1 1 1 1 1 2 3 2 3 3 3 2 3 3 3 3 3 3 2 2 2 2 3 2 2 3 2
## [3072] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 3 2 2 2 2 2 3 3 2 3 2 2 2 2 2 2
## [3109] 2 2 2 2 2 2 3 2 2 2 2 2 2 2 3 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 3 2 2 2 2 2
## [3146] 2 3 2 2 2 2 3 2 2 2 2 2 2 2 3 3 2 2 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 3 3 2
## [3183] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 3 2 2 2 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2
## [3220] 2 2 2 3 3 2 2 2 2 2 2 2 2 2 2 2 1 2 3 2 2 3 3 2 2 2 2 3 3 3 2 2 3 2 2 2 2
## [3257] 1 2 2 2 2 2 2 2 3 2 2 2 3 2 2 3 2 2 2 2 2 2 2 2 2 2 2 3 2 3 2 2 2 2 2 2 2
## [3294] 2 2 3 2 2 3 2 2 3 2 2 2 2 2 2 2 2 3 3 2 2 3 3 2 2 3 2 3 2 3 3 2 3 3 3 3 2
## [3331] 2 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 3 3 2 2 2 3 2 2 2 2 2
## [3368] 2 2 2 3 3 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3405] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3442] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3479] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3516] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3553] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3590] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3627] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3664] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 2 2 2 2 2 2 2 2 1 1 2
## [3701] 1 1 1 2 1 1 1 1 1 1 2 2 1 1 1 1 1 2 1 2 2 1 1 1 1 1 1 2 1 2 1 1 1 1 2 1 2
## [3738] 1 1 2 1 1 1 1 1 1 1 1 1 1 2 2 2 2 1 1 1 1 1 1 1 2 1 1 1 2 1 2 2 1 1 2 2 2
## [3775] 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 1 1 1 1 2 1 1 1 1 1 1 2 1 1 1 1 1
## [3812] 1 1 1 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [3849] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 1 1 2 1 1 1 1 1 1 1 1 1 2 1 1 1 1
## [3886] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [3923] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 2
## [3960] 2 2 2 3 2 2 2 2 2 2 2 3 2 2 2 2 2 2 3 2 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2
## [3997] 2 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 3 2 2 2
## [4034] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
## [4071] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
## [4108] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
## [4145] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
## [4182] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
## [4219] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
## [4256] 2 2 2 2 2 2 2 2 2 2 2 2 2 1 2 2 2 2 1 2 1 2 2 2 2 2 2 2 2 2 2 1 1 2 2 2 2
## [4293] 2 2 1 1 2 2 2 1 2 1 2 2 1 2 2 2 2 2 2 2 2 2 2 2 1 2 2 2 1 2 1 2 2 2 2 2 1
## [4330] 2 2 2 2 2 2 1 2 2 2 2 2 2 1 2 2 1 1 2 1 1 1 2 1 1 2 2 2 2 2 2 2 1 1 1 1 1
## [4367] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 2 2 2 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [4404] 1 2 1 1 1 1 1 2 1 2 1 1 1 2 1 1 2 1 1 1 1 2 1 2 1 1 1 1 1 1 1 1 1 1 1 2 2
## [4441] 1 1 1 1 1 2 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 2 1 2 1 2 2 2 1 1 2 1 1 1 1 2
## [4478] 2 1 1 2 1 1 2 2 2 2 2 2 1 1 2 1 2 1 1 1 1 1 1 2 2 2 2 2 2 1 1 1 2 2 2 1 1
## [4515] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 1 1 1 1 2 1 1 2 1 1 1 2 1 1
## [4552] 1 1 1 1 1 1 2 2 1 1 1 1 2 2 2 2 1 1 2 2 1 2 2 1 1 1 1 2 2 1 1 1 1 1 2 1 2
## [4589] 1 1 2 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 1 1 1 1 1 1 2 1
## [4626] 1 1 1 1 1 1 1 1 2 1 1 1 1 1 1 1 2 1 2 1 1 2 2 2 2 2 2 2 2 2 2 1 2 2 2 2 2
## [4663] 2 1 2 1 2 2 2 1 2 2 2 1 2 2 2 1 1 1 1 1 1 1 2 2 2 2 2 2 2 1 2 2 2 2 1 2 2
## [4700] 2 1 2 2 2 2 2 2 2 2 2 1 1 2 2 1 2 2 2 2 1 1 2 2 2 2 1 2 2 1 1 2 2 2 2 2 2
## [4737] 2 2 1 1 1 2 2 1 1 2 1 1 1 2 2 1 2 1 1 1 1 1 2 2 1 2 1 1 1 1 1 1 1 1 1 1 1
## [4774] 1 1 1 2 1 2 2 1 1 1 1 1 2 2 1 1 1 1 2 1 1 1 2 1 2 1 1 1 1 1 1 2 1 2 2 1 1
## [4811] 1 1 1 1 2 1 2 2 1 1 1 1 2 2 2 2 2 1 1 1 1 1 1 1 2 1 1 2 1 1 2 1 1 1 1 1 1
## [4848] 1 1 1 1 1 1 1 1 1 2 2 2 1 2 1 1 2 1 2 2 1 1 1 1 2 1 2 2 1 2 2 2 2 1 1 1 1
## [4885] 2 1 1 1 2 2 1 2 2 2 2 2 1 1 2 2 2 2 2 1 1 2 1 2 1 1 2 2 1 1 1 1 1 1 2 2 2
## [4922] 1 1 1 1 1 2 1 1 1 1 2 2 1 1 1 2 1 1 2 1 2 2 1 1 2 2 1 1 2 1 1 1 1 2 1 1 1
## [4959] 1 1 2 1 2 1 1 1 1 1 2 1 1 1 2 1 2 2 1 1 1 1 1 1 2 1 1 2 1 1 1 1 1 2 1 2 2
## [4996] 2 1 1 1 1 2 1 1 1 1 2 1 1 1 1 1 1 1 1 2 1 1 1 2 1 1 2 1 2 2 1 2 1 1 2 1 1
## [5033] 1 1 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 2 1 1 2 1 2 1 1 1 1 1 1 1 1 1 1 1 1 2 2
## [5070] 1 1 1 1 1 1 1 1 1 1 1 1
## 
## Within cluster sum of squares by cluster:
## [1] 66.97687 73.69883 60.09831
##  (between_SS / total_SS =  83.7 %)
## 
## Available components:
## 
## [1] "cluster"      "centers"      "totss"        "withinss"     "tot.withinss"
## [6] "betweenss"    "size"         "iter"         "ifault"
cluster_labels
##    [1] 2 2 3 2 2 1 3 2 2 2 2 3 2 2 2 2 2 3 3 3 2 2 2 2 2 3 3 3 1 3 2 2 2 2 3 3 2
##   [38] 2 1 2 2 2 2 3 2 3 2 2 2 2 3 3 2 2 2 3 2 2 2 3 3 2 2 2 2 2 2 2 2 2 2 1 2 2
##   [75] 2 2 2 2 2 1 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2
##  [112] 1 2 2 2 1 2 2 2 1 2 2 1 1 2 2 2 2 2 2 2 2 2 2 2 2 2 2 1 2 1 2 2 2 2 2 2 2
##  [149] 1 2 2 2 1 2 2 2 2 2 2 2 2 2 1 2 2 2 2 2 2 2 2 2 2 1 2 2 1 2 2 2 2 2 2 1 2
##  [186] 1 2 1 2 1 2 1 2 2 2 2 2 2 2 2 2 2 1 1 1 1 1 1 2 1 1 1 1 2 2 1 2 1 2 2 2 1
##  [223] 1 2 2 2 1 2 2 2 2 2 1 2 2 1 2 1 1 1 2 1 2 2 2 1 1 2 1 1 1 2 1 1 2 1 2 2 1
##  [260] 2 2 1 1 1 1 2 1 2 2 2 2 1 2 2 2 1 2 2 2 2 2 2 1 1 2 1 2 2 1 2 2 2 1 2 2 2
##  [297] 1 2 1 2 1 2 2 1 2 1 1 1 2 1 1 2 1 2 2 2 2 2 2 2 1 2 2 1 2 2 1 2 2 2 1 2 2
##  [334] 1 1 2 2 1 2 1 1 1 2 2 1 1 2 2 2 2 2 2 2 2 2 2 1 1 1 2 1 2 1 2 2 3 3 3 3 3
##  [371] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 2 3 3 3 3 3 3 3 3 3 3 3
##  [408] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
##  [445] 3 3 3 3 3 3 3 3 3 3 3 3 3 2 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 2 3 2
##  [482] 3 3 3 3 3 3 3 3 3 2 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 2 3 3 3 3 3 3 3 3 3 3
##  [519] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 2 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
##  [556] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 2 3 3 3 3 3 2 3 3 3 3 3 3
##  [593] 3 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
##  [630] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
##  [667] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
##  [704] 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
##  [741] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
##  [778] 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
##  [815] 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 2 1 2 1 2 2 2 2 2
##  [852] 2 1 2 1 2 2 2 2 2 1 2 2 2 2 1 2 2 2 2 1 1 1 2 2 2 2 1 2 2 2 1 2 1 1 2 2 1
##  [889] 2 1 1 2 1 1 1 1 1 1 1 2 2 1 1 1 2 1 2 2 2 2 1 1 1 1 2 1 2 1 1 2 1 2 2 2 1
##  [926] 2 1 1 2 2 2 2 1 2 1 1 2 1 2 2 2 1 2 1 1 1 2 2 1 1 1 2 2 2 2 2 1 2 2 2 2 2
##  [963] 1 1 2 2 1 1 1 2 2 2 1 2 1 1 1 1 1 1 2 2 2 2 1 1 1 1 1 2 1 1 2 2 2 2 1 1 1
## [1000] 2 1 1 1 1 2 1 2 2 1 2 1 1 2 1 1 2 1 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 2 1 2
## [1037] 1 2 2 1 1 1 1 1 1 1 2 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1074] 1 1 2 1 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1111] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1148] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1185] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1222] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1259] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1296] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1333] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [1370] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 1 2 2 1 1 1 2 2 2 2 1 2 3
## [1407] 1 2 2 1 1 1 1 2 2 2 2 2 1 2 3 2 1 1 2 1 3 2 2 2 2 1 2 2 2 1 1 1 3 3 1 2 2
## [1444] 3 1 1 2 2 2 2 1 2 2 2 3 2 1 1 2 2 3 2 2 2 1 2 2 2 1 3 2 1 2 2 2 2 1 2 3 3
## [1481] 1 3 3 1 2 3 2 3 2 2 2 1 3 2 2 3 2 2 2 2 2 2 2 2 3 2 2 2 2 2 2 1 2 2 1 2 2
## [1518] 2 2 2 2 2 3 2 1 2 2 2 2 2 1 2 2 2 2 2 2 2 1 2 1 2 2 2 2 2 2 1 2 2 2 2 2 2
## [1555] 2 2 2 2 2 2 3 2 2 1 3 2 2 2 2 2 1 2 3 2 2 2 1 2 2 2 2 2 2 2 2 2 2 1 2 2 2
## [1592] 2 2 2 2 2 1 1 2 2 2 2 3 2 1 2 2 1 1 1 2 1 2 1 2 2 1 2 1 2 2 2 2 2 1 2 2 1
## [1629] 1 2 2 2 2 2 2 1 2 2 1 2 1 3 2 2 3 2 3 2 3 3 2 3 3 3 2 3 3 3 3 2 2 2 3 2 2
## [1666] 2 3 2 2 3 2 2 3 2 2 2 2 1 2 2 3 1 2 2 2 3 1 3 2 2 2 2 2 2 3 2 3 2 3 2 3 3
## [1703] 3 2 2 3 2 2 3 2 3 2 2 3 2 3 2 2 3 2 2 1 2 2 3 2 2 3 3 3 3 2 3 2 2 2 3 2 3
## [1740] 2 2 2 1 3 1 2 3 2 3 1 3 2 3 3 3 2 3 2 3 3 3 3 2 3 2 2 1 1 2 3 3 3 2 3 3 3
## [1777] 3 3 3 2 2 3 3 2 3 3 3 2 2 3 3 1 3 2 3 3 2 3 3 2 3 2 2 2 1 3 3 3 2 2 3 3 2
## [1814] 3 2 3 3 2 1 2 2 2 2 3 1 3 3 2 1 2 1 2 1 1 1 1 2 2 2 2 1 1 2 3 2 2 2 2 1 2
## [1851] 2 3 2 1 2 1 2 3 2 3 2 3 2 1 2 2 2 1 2 1 1 1 2 3 2 2 2 2 1 1 2 1 2 1 2 2 2
## [1888] 1 2 3 2 3 2 2 3 2 2 2 3 3 2 2 2 2 2 2 3 2 1 2 2 2 2 2 2 2 1 2 2 2 2 3 2 2
## [1925] 2 2 3 3 2 2 3 3 2 3 3 2 2 2 3 3 2 2 2 1 2 1 2 3 1 2 2 3 2 2 1 3 1 2 1 3 2
## [1962] 2 1 3 2 3 3 2 2 2 2 3 2 2 2 2 2 1 1 2 2 2 2 2 2 2 2 2 2 3 2 2 2 2 2 2 2 3
## [1999] 3 2 2 2 2 2 2 2 2 3 2 1 3 2 2 2 2 3 2 2 2 2 3 2 2 2 3 2 2 2 1 2 2 2 2 2 3
## [2036] 3 2 2 2 2 2 3 3 2 2 2 1 2 3 2 2 2 2 2 2 3 2 2 2 3 3 3 3 2 2 3 2 2 3 2 2 2
## [2073] 2 2 3 2 2 2 3 2 2 2 2 3 2 2 3 3 2 2 3 2 2 2 2 2 2 2 3 2 3 3 2 2 2 3 1 2 2
## [2110] 2 2 2 2 3 3 2 2 2 2 3 2 2 2 1 2 2 3 2 2 2 2 3 2 3 2 1 3 3 2 3 3 2 2 2 2 2
## [2147] 3 2 2 2 2 2 2 3 2 2 2 1 2 3 2 2 2 2 2 3 2 2 2 2 2 2 3 3 2 2 2 3 3 3 2 2 2
## [2184] 2 2 2 2 1 2 2 3 1 1 2 2 2 1 3 2 2 2 2 2 2 2 3 2 2 1 2 2 2 2 2 2 2 3 2 2 2
## [2221] 2 2 2 2 3 2 2 3 2 1 3 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 1 2
## [2258] 2 2 2 1 1 1 1 1 1 2 1 2 2 1 2 2 2 2 1 2 1 2 1 2 2 2 2 1 2 2 2 1 1 1 2 2 1
## [2295] 2 2 2 1 2 1 2 2 1 2 2 2 2 1 1 2 1 2 2 1 1 2 2 2 2 2 2 2 1 1 2 2 1 2 1 2 2
## [2332] 1 2 1 1 2 1 1 1 1 1 2 1 1 1 1 1 2 2 2 2 2 1 1 2 2 2 1 2 2 2 2 2 1 2 2 2 1
## [2369] 1 1 2 2 2 2 2 2 2 1 2 1 2 1 2 2 1 2 1 2 1 2 2 2 2 2 1 1 2 1 2 1 1 2 2 2 1
## [2406] 2 2 2 2 2 2 2 2 2 2 1 1 2 2 2 2 2 2 1 1 1 1 1 2 1 2 1 2 1 1 1 2 2 2 1 1 2
## [2443] 2 2 1 1 2 1 2 1 2 1 1 2 1 2 2 2 1 2 2 1 1 2 2 2 2 2 1 1 2 2 2 1 2 1 2 2 2
## [2480] 2 2 2 2 1 2 2 1 1 2 2 1 1 2 2 2 2 2 2 1 2 1 2 2 1 1 1 2 2 1 1 2 1 1 1 2 2
## [2517] 2 1 1 2 1 2 1 2 2 2 2 1 2 2 2 2 2 1 1 1 1 1 2 2 1 1 1 2 1 1 2 1 1 1 1 1 1
## [2554] 2 1 1 2 1 2 2 1 2 1 1 2 2 1 2 1 1 1 1 1 2 1 1 1 2 1 2 1 1 2 1 2 2 2 2 2 2
## [2591] 1 1 2 2 1 1 1 1 2 1 1 2 1 1 2 2 2 1 2 1 1 1 1 1 1 2 1 2 1 1 2 2 1 2 2 1 1
## [2628] 2 1 2 2 1 1 1 2 2 1 1 1 2 2 1 2 2 1 2 1 1 2 2 1 1 1 2 1 1 1 1 1 1 2 1 1 2
## [2665] 1 1 1 2 1 1 1 1 1 1 1 1 2 1 2 1 1 1 2 2 1 1 1 1 2 2 1 2 2 1 2 1 1 2 1 1 1
## [2702] 1 2 1 2 1 1 2 1 2 2 2 2 2 3 2 3 2 1 1 3 2 2 2 1 2 1 3 3 1 1 2 1 1 1 2 2 3
## [2739] 3 2 3 1 2 2 1 1 1 2 2 1 2 1 1 2 1 3 1 3 2 3 3 2 2 1 1 1 1 3 2 3 2 2 1 2 3
## [2776] 1 2 3 2 2 1 2 3 1 1 2 2 2 1 1 1 1 1 3 2 2 1 2 1 2 1 3 2 2 2 3 3 1 2 3 1 1
## [2813] 2 1 2 1 1 1 2 3 3 1 2 2 3 3 2 2 3 1 1 2 1 2 1 2 2 2 2 2 1 3 2 1 2 1 1 2 2
## [2850] 1 2 1 3 2 2 2 2 1 1 3 2 2 3 2 2 1 1 2 2 2 1 2 2 2 2 1 1 1 3 1 2 1 2 2 1 3
## [2887] 2 1 3 2 2 2 2 1 1 2 1 2 2 1 1 2 1 1 2 1 1 2 1 2 2 1 1 1 3 1 2 2 1 2 1 2 2
## [2924] 2 1 2 2 1 2 2 2 2 1 2 2 1 2 1 1 1 2 1 2 2 1 1 1 1 2 2 2 1 1 1 1 1 1 2 1 2
## [2961] 2 1 1 1 1 2 2 2 3 1 2 3 2 2 2 1 1 1 2 1 1 2 2 1 3 2 1 1 1 2 1 2 1 3 1 2 2
## [2998] 1 2 3 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 2 1 2 1 1 1 2 1 1 1 3 1 1 1 1 1
## [3035] 2 1 1 1 2 1 2 1 1 1 1 1 1 1 1 2 3 2 3 3 3 2 3 3 3 3 3 3 2 2 2 2 3 2 2 3 2
## [3072] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 3 2 2 2 2 2 3 3 2 3 2 2 2 2 2 2
## [3109] 2 2 2 2 2 2 3 2 2 2 2 2 2 2 3 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 3 2 2 2 2 2
## [3146] 2 3 2 2 2 2 3 2 2 2 2 2 2 2 3 3 2 2 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 3 3 2
## [3183] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 3 2 2 2 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2
## [3220] 2 2 2 3 3 2 2 2 2 2 2 2 2 2 2 2 1 2 3 2 2 3 3 2 2 2 2 3 3 3 2 2 3 2 2 2 2
## [3257] 1 2 2 2 2 2 2 2 3 2 2 2 3 2 2 3 2 2 2 2 2 2 2 2 2 2 2 3 2 3 2 2 2 2 2 2 2
## [3294] 2 2 3 2 2 3 2 2 3 2 2 2 2 2 2 2 2 3 3 2 2 3 3 2 2 3 2 3 2 3 3 2 3 3 3 3 2
## [3331] 2 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 3 3 2 2 2 3 2 2 2 2 2
## [3368] 2 2 2 3 3 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3405] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3442] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3479] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3516] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3553] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3590] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3627] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
## [3664] 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 2 2 2 2 2 2 2 2 1 1 2
## [3701] 1 1 1 2 1 1 1 1 1 1 2 2 1 1 1 1 1 2 1 2 2 1 1 1 1 1 1 2 1 2 1 1 1 1 2 1 2
## [3738] 1 1 2 1 1 1 1 1 1 1 1 1 1 2 2 2 2 1 1 1 1 1 1 1 2 1 1 1 2 1 2 2 1 1 2 2 2
## [3775] 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 1 1 1 1 2 1 1 1 1 1 1 2 1 1 1 1 1
## [3812] 1 1 1 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [3849] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 1 1 2 1 1 1 1 1 1 1 1 1 2 1 1 1 1
## [3886] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [3923] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 2
## [3960] 2 2 2 3 2 2 2 2 2 2 2 3 2 2 2 2 2 2 3 2 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2
## [3997] 2 2 2 2 2 3 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 3 2 2 2
## [4034] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
## [4071] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
## [4108] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
## [4145] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
## [4182] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
## [4219] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
## [4256] 2 2 2 2 2 2 2 2 2 2 2 2 2 1 2 2 2 2 1 2 1 2 2 2 2 2 2 2 2 2 2 1 1 2 2 2 2
## [4293] 2 2 1 1 2 2 2 1 2 1 2 2 1 2 2 2 2 2 2 2 2 2 2 2 1 2 2 2 1 2 1 2 2 2 2 2 1
## [4330] 2 2 2 2 2 2 1 2 2 2 2 2 2 1 2 2 1 1 2 1 1 1 2 1 1 2 2 2 2 2 2 2 1 1 1 1 1
## [4367] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 2 2 2 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
## [4404] 1 2 1 1 1 1 1 2 1 2 1 1 1 2 1 1 2 1 1 1 1 2 1 2 1 1 1 1 1 1 1 1 1 1 1 2 2
## [4441] 1 1 1 1 1 2 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 2 1 2 1 2 2 2 1 1 2 1 1 1 1 2
## [4478] 2 1 1 2 1 1 2 2 2 2 2 2 1 1 2 1 2 1 1 1 1 1 1 2 2 2 2 2 2 1 1 1 2 2 2 1 1
## [4515] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 1 1 1 1 2 1 1 2 1 1 1 2 1 1
## [4552] 1 1 1 1 1 1 2 2 1 1 1 1 2 2 2 2 1 1 2 2 1 2 2 1 1 1 1 2 2 1 1 1 1 1 2 1 2
## [4589] 1 1 2 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 1 1 1 1 1 1 1 2 1
## [4626] 1 1 1 1 1 1 1 1 2 1 1 1 1 1 1 1 2 1 2 1 1 2 2 2 2 2 2 2 2 2 2 1 2 2 2 2 2
## [4663] 2 1 2 1 2 2 2 1 2 2 2 1 2 2 2 1 1 1 1 1 1 1 2 2 2 2 2 2 2 1 2 2 2 2 1 2 2
## [4700] 2 1 2 2 2 2 2 2 2 2 2 1 1 2 2 1 2 2 2 2 1 1 2 2 2 2 1 2 2 1 1 2 2 2 2 2 2
## [4737] 2 2 1 1 1 2 2 1 1 2 1 1 1 2 2 1 2 1 1 1 1 1 2 2 1 2 1 1 1 1 1 1 1 1 1 1 1
## [4774] 1 1 1 2 1 2 2 1 1 1 1 1 2 2 1 1 1 1 2 1 1 1 2 1 2 1 1 1 1 1 1 2 1 2 2 1 1
## [4811] 1 1 1 1 2 1 2 2 1 1 1 1 2 2 2 2 2 1 1 1 1 1 1 1 2 1 1 2 1 1 2 1 1 1 1 1 1
## [4848] 1 1 1 1 1 1 1 1 1 2 2 2 1 2 1 1 2 1 2 2 1 1 1 1 2 1 2 2 1 2 2 2 2 1 1 1 1
## [4885] 2 1 1 1 2 2 1 2 2 2 2 2 1 1 2 2 2 2 2 1 1 2 1 2 1 1 2 2 1 1 1 1 1 1 2 2 2
## [4922] 1 1 1 1 1 2 1 1 1 1 2 2 1 1 1 2 1 1 2 1 2 2 1 1 2 2 1 1 2 1 1 1 1 2 1 1 1
## [4959] 1 1 2 1 2 1 1 1 1 1 2 1 1 1 2 1 2 2 1 1 1 1 1 1 2 1 1 2 1 1 1 1 1 2 1 2 2
## [4996] 2 1 1 1 1 2 1 1 1 1 2 1 1 1 1 1 1 1 1 2 1 1 1 2 1 1 2 1 2 2 1 2 1 1 2 1 1
## [5033] 1 1 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 2 1 1 2 1 2 1 1 1 1 1 1 1 1 1 1 1 1 2 2
## [5070] 1 1 1 1 1 1 1 1 1 1 1 1
#scatter plot visualization of clusters
plot(avg_spikes, col = cluster_labels, pch = 16, xlab = "Trials", ylab = "Avg Spike Counts", main = "Clustering of Avg Spike Counts")

legend("topright", legend = unique(cluster_labels), col = unique(cluster_labels), pch = 16, title = "Clusters")

#getting summaries for each cluster
c1 <- summary(avg_spikes[cluster_labels == 1])

c2<- summary(avg_spikes[cluster_labels == 2])

c3 <- summary(avg_spikes[cluster_labels == 3])

#generating every feedback type response for every trial 
feedback_types <- c()
for (i in 1:length(session)) {
     this_session <- session[[i]]
     
     feedback_types <- c(feedback_types, this_session$feedback_type)
}

#making sure cluster groupings, avg spike count and feedback type are all reflected for every single trial
feedback_types <- c()
fixed_avg_spikes <- c()
fixed_cluster_labels <- c()

for (i in 1:length(session)) {
    this_session <- session[[i]]
    
    avg_spikes_session <- avg_spikes[1:length(this_session$spks)]
    cluster_labels_session <- cluster_labels[1:length(this_session$spks)]
    feedback_types <- c(feedback_types, this_session$feedback_type)
    
    fixed_avg_spikes <- c(fixed_avg_spikes, avg_spikes_session)
    fixed_cluster_labels <- c(fixed_cluster_labels, cluster_labels_session)
}

#creating dataframe
data.set <- data.frame(avg_spikes = fixed_avg_spikes, cluster_labels = fixed_cluster_labels, feedback_types = feedback_types)

#creating logistical regression model
x <- as.matrix(data.set[, c("avg_spikes", "cluster_labels")])
y <- ifelse(data.set$feedback_types == 1, 1, 0)

model <- glmnet(x, y, family = "binomial")

#test it's accuracy levels 
pred <- predict(model, newx = x, type = "response")

predict <- ifelse(pred > 0.5, 1, 0)

accuracy <- mean(predict == y)

#Testing my model agaisnt test rds files

test.files <- list.files("~/Downloads/test", pattern = ".rds", full.names = TRUE)

test.data <- list()
for (i in 1:length(test.files)) {
  test.data[[i]] <- readRDS(test.files[i])
}

test_avg_spikes <- numeric(0)
test_cluster_labels <- numeric(0)
test_feedback_types <- numeric(0)

for (i in 1:length(test.data)) {
  this_session <- test.data[[i]]
  
  #loop over trials in current session
  for (j in 1:length(this_session$spks)) {
    spks_trial <- this_session$spks[[j]]
    total_spikes <- apply(spks_trial, 1, sum)
    avg_spike <- mean(total_spikes)
    cluster_label <- clusters$cluster[j]
    feedback_type <- this_session$feedback_type[j]
    
    test_avg_spikes <- c(test_avg_spikes, avg_spike)
    test_cluster_labels <- c(test_cluster_labels, cluster_label)
    test_feedback_types <- c(test_feedback_types, feedback_type)
  }
}

test.dataset <- data.frame(avg_spikes = test_avg_spikes, cluster_labels = test_cluster_labels, feedback_types = test_feedback_types)

x.test <- as.matrix(test.dataset[, c("avg_spikes", "cluster_labels")])
predtest <- predict(model, newx = x.test, type = "response")
predict.test <- ifelse(predtest > 0.5, 1, 0)

test_accuracy <- mean(predict.test == test.dataset$feedback_types)
test_accuracy
## [1] 0.725
#confusion matrix
conf.matrix <- table(test.dataset$cluster_labels, test.dataset$feedback_types)
misclassif <- sum(conf.matrix) - sum(diag(conf.matrix))
misclassification.rate <- misclassif / sum(conf.matrix)